Short Tail Or Long Tail Keywords? — A Side-by-Side Comparison

Posted by on Jul 27, 2018 in SEO Articles | Comments Off on Short Tail Or Long Tail Keywords? — A Side-by-Side Comparison


Long Tail vs Short Tail Keywords

Image credits: Aida Blakely

When it comes to on-page SEO, keywords are the biggest factor in determining your SEO success or failure. Before deciding which keywords to go for, you’ll need to do some homework and research. One question that often arises during keyword research is:

Short Tail or Long Tail Keywords?

The debate has been raging for years, and its conclusions are something anyone doing marketing on the Internet should be aware of.

What Are Short Tail And Long Tail Keywords?
Short Tail Keywords

Short tail keywords are three words or fewer. Examples include: “athletic apparel,” “DVD player,” or “engagement ring.” Short tail keywords are also known as “head terms”. They may be the first thing you think of when you are deciding where to go to eat (“Chinese food,” “pizza delivery”), what to do (“dance clubs,” “roller coaster park”), or where to worship (“synagogue,” “Catholic church”).

Long Tail Keywords

Long tail keywords, however, are a little different. They run to more than three words, and they are far more targeted and much less broad. You may not bring in as much search traffic from long tail keywords, but the traffic you do bring in is the kind you are looking for.

Examples of long tail keywords include: “summer women’s athletic apparel,” “super Blu Ray HD DVD combo player,” or “white diamond engagement ring.” Long tail keywords are obviously a lot more specific than short tail keywords; as a marketer this actually can work very much in your favor. But there are pros and cons to both types of keywords.
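If you want to triage a keyword list by this distinction, the rule is easy to operationalize. Here is a minimal Python sketch using the three-words-or-fewer threshold this article works with (the function name is just illustrative):

```python
def classify_keyword(keyword: str) -> str:
    """Classify a keyword as short tail or long tail.

    Uses the threshold from this article: three words or fewer
    is short tail (a "head term"); more than three is long tail.
    """
    word_count = len(keyword.split())
    return "short tail" if word_count <= 3 else "long tail"


for kw in ["engagement ring", "white diamond engagement ring"]:
    print(f"{kw!r} -> {classify_keyword(kw)}")
```

Note that some practitioners draw the line at four words, or define long tail by search volume rather than length; the word-count cutoff here simply follows this article’s definition.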

The Long and Short Of Keywords

Which type of keyword you want to use for your marketing is going to depend very much on the type of traffic that you want to drive to your site.

As keywords get longer, search volume drops. However, most other metrics, such as conversion rate, swing in favor of long tail keywords.

Short Tail Keywords

Short tail keywords have several things working for and against them. If you are trying to drive a lot of traffic to your site, short tail keywords are the way to do it. The challenge is that if your site is “new,” or if your search efforts are just beginning, you are joining the back of a very long line.

Volume: High

When it comes to volume, short tail keywords beat long tail keywords every time. The shorter the keyword, the higher the search volume. If you can rank for a short tail keyword, you’re definitely going to get plenty of organic traffic.

Competition: High

Given the high search volumes, it’s no wonder everyone is trying to rank for short tail keywords; the reward is high. As a result, competition for them is fierce.

To give you a clearer picture, for pizza searches, you are behind brands like Pizza Hut, Pizza Pizza, Dominos, Papa John’s as well as all the ranking sites, local searches, and the like.

The pattern here is clear: unless your company is huge at an international level, it is really tough to get onto the first page of Google search results.

Focus: Low

There is also the issue of your search not being “targeted.” People searching for “DVD” may be looking for a player to buy, but they may also be looking for a DVD player to rent, a DVD film, a DVD reproduction service, a list of DVD rentals, and the like. You are going up against names like Samsung, Sony, and Amazon when you simply search for “DVD.”

Cost: High

Short tail keywords also come with a cost factor, and it gets expensive. Google AdWords will charge you a pretty penny to get into the short tail keyword business for common search terms. Because so many other people are buying them, you will need to pay a premium for your presence in these searches.

Conversion Rate: Low

Finally, the thing that irks most people about short tail keywords is the low conversion rate. Say you have a term that’s searched for a lot: even with 10,000 searches and 100 clicks, you may convert only one or two customers.

While those one or two customers may be your bread and butter, the truth is that you may have more luck and less noise if you opted for long tail keywords.
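To make that arithmetic concrete, here is a quick sketch. The figures are the illustrative ones from the paragraph above, not real campaign data:

```python
searches = 10_000   # impressions for a popular short tail term
clicks = 100        # visits those impressions produce
customers = 2       # visitors who actually buy

ctr = clicks / searches               # click-through rate: 0.01 (1%)
conversion_rate = customers / clicks  # conversions per click: 0.02 (2%)

print(f"CTR: {ctr:.1%}, conversion rate: {conversion_rate:.1%}")
```

In other words, even a respectable 2% conversion rate on clicks still means only a handful of customers out of thousands of searches when the traffic is untargeted.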

Long Tail Keywords

Long tail keywords are like a bear in the forest. They can lie dormant for a while, but when they are used they are typically quite deadly. And, like the bear in the forest, there isn’t too much else that competes with these keywords, as you’ll see.

Volume: Low

When you are talking about long tail keywords, you have to accept a lower level of volume. Your traffic from long tail keywords is going to be far less, and for some types of businesses this may be a bad thing.

However, if you have your wares that you are selling and you are trying to cater to your specific customer, you may not want a lot of beady eyes and sweaty breaths clogging up your virtual storefront.

Competition: Low

Competition is also lower once you have decided on a long tail keyword. With a search term like “RV camper power cord hatch cover” or “baseball card holder sheets,” you are getting traffic, and the good news, if you’re selling these things, is that few others are selling the same thing. The search is going to be geared towards whatever it is you are selling, and the competition for this specific traffic will be low.

Focus: High

Just like the low competition made evident, the targeted nature of the search traffic you get will, for most businesses selling specific things, be ideal. You will pretty well only have people who are looking for “toddler ballerina shoes with ribbon” or “cheap loveseat recliner covers,” coming to your store. That means you’re a lot more likely to have the customers you are looking for, looking for you!

Cost: Low

Another benefit of lower-traffic long tail keywords is that you will pay a lot less for them. Google AdWords charges a reduced price for searches that are specific and contain more terms. This means you won’t show up nearly as high in general searches (until you become the preeminent name in your industry), but the cost to get there will be much lower.

For small businesses running PPC campaigns, this makes a huge difference to monthly costs. And when you are selling more to fewer, better-qualified customers, that is even better!

Conversion Rates: High

One last point about long tail keywords is the increased conversion rate. If people are seeking out things as specific as those discussed above, you will have a far easier time converting the traffic you generate. These conversion numbers will obviously vary depending on what you’re selling and where, but the numbers point very strongly in favor of long tail keyword selection.

Moreover, if customers like what they see when they arrive via a long tail search, then even if they don’t buy the first time around, they are far more likely to come back to your online store when they do buy, simply because your site spoke to their specific needs.

Which Is Better?

To boil this entire article down to a single verdict: for most small businesses, long tail keywords are the far better choice, and rightly so.

At the end of the day, you need to do what is best for you and your small business. Of course you want to save money and you want to have as large a web presence as you possibly can.

At the same time you need to remember that the point of your having a web business isn’t (generally) to get people to click to your site and walk away unsatisfied with the results their search has given them; the idea is that they spend money!

Getting your customers to drop that dime and try out your business is the whole point. If you want to increase your conversion and make that sale then you should be directing traffic to your specific type of widget, whatever that may be.

In this day and age of online search, it is very difficult to break into the rankings for short tail keywords. As discussed earlier, it is almost impossible for small or even medium-sized businesses to rank among the big boys.

So if this is not a fight that you are even able to have, why would you want to try?

If you have a huge body of content, a broadly appealing product, or strong brand and domain authority, then maybe short tail keywords will still work for you.

However, if you are looking for a higher conversion rate, lower cost, lower competition, and traffic that is specifically after what you are selling, the long tail keyword game is one you should be in.

This post was originally written by Zhi Yuan and published on Nov 18, 2015. It was most recently updated on July 27, 2018.

Related Links:

How To Decide Which Keywords To Use? — Comprehensive Keyword Research Guide
How To Increase Conversion Rate By 113% Using Retargeting Ads
Inbound Marketing vs Outbound Marketing – Which Is More Effective?


Why AI and international paid media is a match made in hell

Posted by on Jul 27, 2018 in SEO Articles | Comments Off on Why AI and international paid media is a match made in hell

When looking back on summer 2018, it’s hard to ignore the optimism that’s been in the air. Sunny weather? Check. England football triumph? Almost! AI as the next big thing in digital marketing? Try and count the number of articles, blog posts and sound bites that you’ve encountered over the last month which cite AI in a hype-tastic way.

Now we’re all for a bit of well-reasoned optimism, and there is no doubt that AI is an extremely powerful toolkit that will positively impact all kinds of socio-economic activity. But we’re not so sure about the true value of AI in the context of digital marketing, and specifically for international paid media.

Back to basics

Cutting through the hype, let’s start by looking at exactly how AI and machine learning work in the context of international paid media. For example, on a keyword level, how much and what kind of data are needed for AI to make a good decision?

Well, Google’s machine learning product Smart Bidding states that it “enables you to tailor bids based on each user’s context. Smart Bidding includes important signals like device, location and remarketing lists for better automation and performance”.

This implies that the signals required by the algorithm can be culled from the sum of users’ behavior, and that its “learning capabilities quickly maximize the accuracy of your bidding models to improve how you optimize the long-tail [by evaluating] patterns in your campaign structure, landing pages, ad text, product information, keyword phrases and many more to identify more relevant similarities across bidding items to effectively borrow learnings between them”.

This suggests that the ‘go to’ source of data is our own campaign. But what are these patterns, how long is ‘quickly’, and how on earth would landing page data help with bid management?

Staying with bid management as an example, we think it works like this:

Primary data – the algorithm looks back at historic direct interactions with a keyword within a client campaign, and makes a cost/position decision based on pre-defined goals like ROI or CTR, provided there is enough data.
One way to address a possible data volume problem would be to look back a long way. But this would ignore seasonality, promotions and changes in consumer behavior over time.
Secondary data – the algorithm has insufficient data to make a ‘good’ decision on the primary basis, so it uses corroborative data (performance indicators from other campaigns which have similar characteristics, e.g. same vertical, same language) to make decisions.
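That fallback from primary to secondary data could be sketched in code roughly as follows. To be clear, this is our speculation about the mechanism, not Google’s documented implementation; the interaction threshold and field names are invented purely for illustration:

```python
def estimate_value_per_click(keyword, similar_campaigns, min_interactions=1000):
    """Speculative sketch of a primary/secondary bid-input decision.

    keyword: dict with 'interactions' (historic clicks) and 'value_per_click'.
    similar_campaigns: dicts from campaigns with similar characteristics
    (e.g. same vertical, same language), each with a 'value_per_click'.
    """
    # Primary data: enough direct history on this keyword to trust it?
    if keyword["interactions"] >= min_interactions:
        return keyword["value_per_click"]

    # Secondary data: "borrow learnings" from similar campaigns instead.
    borrowed = [c["value_per_click"] for c in similar_campaigns]
    return sum(borrowed) / len(borrowed)


# A long tail keyword with thin history falls back to corroborative data.
kw = {"interactions": 40, "value_per_click": 3.10}
peers = [{"value_per_click": 1.00}, {"value_per_click": 2.00}]
print(estimate_value_per_click(kw, peers))  # 1.5
```

Even this toy version makes the problem visible: whenever the primary branch lacks data, the quality of the decision depends entirely on how relevant the borrowed data really is.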

Do we even have enough data?

The question is whether, aside from very high-volume, big-category campaigns (think car insurance, credit cards), there is enough primary data to power effective AI decision making. AI needs a huge amount of data to be effective: when IBM’s Deep Blue took on chess, for instance, its developers relied on 5 million data sets. Most industry experts believe that AI’s biggest limitation will be access to high-quality data at sufficient scale.

We also have no idea what a ‘good’ volume of data looks like. Sufficient data is even more unlikely in international PPC, where campaigns are often very granular, multi-language, and designed to include lots of long tail keywords (which by definition do not have much volume).

When it comes to secondary data, how relevant can the corroborative data be? For maximum relevance, taking CLIENT X as an example, we’d have to assume that the algorithm is quickly assimilating data from CLIENT X’s direct competitors and using that to better inform the bid management strategy.

Surely that kind of cross-fertilized data would power all auction players’ bid tactics, creating a loop where no player has an advantage?

If competitor data is not used, then what kind of secondary data is sufficiently relevant to power good AI decisions? This would be easier to answer if we knew definitively how the rules of the algorithms were constructed, but of course, we never will.

Time for a reality check

To recap, if we knew that 10, 100 or even 1,000 interactions were enough to deliver superior efficiency via AI, we’d be delighted. Campaigns could be planned and executed to use the optimum blend of AI and human capabilities, with best results for ad platforms, agencies and clients. AI could focus on brand and category level interactions, with human oversight and detailed management of long tail.

It seems unlikely that adequate transparency (as to how AI actually works, how much data is needed, and how the ‘rules’ work) will be forthcoming unless significant changes in business models or practices occur.

Instead, AI is optimistically overhyped as digital’s next big thing, while the basic premises of AI and the current practicalities of both domestic and international digital paid media are blithely ignored.

What do dolphins eat? Lessons from how kids search

Posted by on Jul 27, 2018 in SEO Articles | Comments Off on What do dolphins eat? Lessons from how kids search


I recently came across a couple of fascinating papers (here and here) about how kids search. I found the research fascinating in its own right, and also thought-provoking for the ways of searching it revealed that had simply never occurred to me. Here are some of the most interesting things I found (though the papers are remarkably accessible, and you should totally read the whole thing).

The researchers studied children aged 7-11, with varying degrees of experience and comfort with the web and with computer-based research. In the course of their study, they identified seven “search roles” (almost like personas) that children display when seeking information.

Many of these are fairly self-explanatory on the surface (though it’s always interesting to read the details) and you may even identify with some of them yourself, as an adult. One of the most interesting to me was what they called the visual searcher.

People don’t all think like you

This was a mode of search that I had rarely found myself in, and had barely even considered could be a thing outside of certain forms of specific image search (e.g. [microsoft logo]). What they found was a cohort of children who turned first to image search for a wide range of their information-gathering needs. In some cases, this appeared to be motivated by discomfort with text and with reading, or at least with scanning and reading fast. In others, though, it seemed to be about veracity and trusting only what you have seen with your own eyes. For those of us who know people who write on the internet, maybe this isn’t the craziest instinct.

One example that has stayed in my mind since I read about it is the experience of certain kids when asked to answer the question what do dolphins eat?

The anecdote that stood out for me was the child who not only turned to image search to answer the question, but did the one-word image search [dolphin] and then scrolled down through pages of results until, having found a picture of a dolphin eating something, turned to the researcher to declare triumphantly that dolphins eat fish.

The lesson here is clearly about the power of observing real-world users. This is the kind of insight that is hard to glean from the raw data of keyword research. Even if you figure out that there is image search volume for [dolphin], you’re some way from the insight that someone is searching for information about what they eat.

This era (the research was published in 2010) was marked by a wide range of qualitative research coming out of Google. I might dive deeper into some other research in another post, but for now, onto the next insight.

There are searches that are hard, and people are failing to complete them

In my presentation and post “the next trillion searches”, I talked about the incremental search volume available in the coming years as technology progresses to the point that it can satisfy intents and answer questions that current technology cannot.

One of the things I didn’t talk about in that post was the times that current searcher intent is not fulfilled even though the information is out there and today’s technology is more than capable of finding it. To understand more about what I mean here, let’s take another look at search challenges for kids:

For a start, it’s worth noting that Google can’t answer this query outright. Unlike with more and more factual queries, Google is not able to return a one-box with any answer, never mind the correct answer.

Unsurprisingly, kids struggled with this one (as I suspect would many adults). It tests their ability to string together a sequence of queries, each one building on the last, to discover the answer at the end of the rainbow. And along the way, they have to be sceptical of the information they come across and not get distracted by the pots of fools’ gold:

At certain points along the way, our intrepid searcher may come across pages that purport to give the answer, but which in fact do not for a variety of reasons (not least, as with the example above, that this information can fall easily out of date).

So it combines the ability to break down a question into structured thoughts, achieve complex stringing together of queries, and avoid pitfalls of incorrect and misleading information along the way. How many adults do you know who might trip up on this?

Amazingly, some of the older kids in the study managed to find the correct answer.

If you have kids in your life, try this out

If you have kids, or younger siblings, cousins, nieces, nephews, etc., I’d strongly encourage anyone interested in search to sit with them while they take on relatively undirected searching tasks. I think it’s pretty educational (for them!), but there’s also a good chance you will learn a good deal yourself. In particular, since this research was done in 2010, it appears to have been entirely desktop-driven. I’d be interested in the mobile-first version if anyone wants to run it and write it up!

Anyway, it turns out my kids are (roughly) in the right age range – at the time of experimenting, my daughter had just turned 8, and my son was 5. My daughter was squarely within the range the researchers studied, and it was interesting to see how she fared:

Rachel aged 8

She found it fairly easy to find out what dolphins eat. Google coped fine with her misspelling of “dolfin” and she wasn’t fazed by the results coming back for the correct spelling. She didn’t bother reading the “showing results for…” section (nor the paid ad, natch) and skipped straight to the one-box. She scanned it without reading aloud and then answered the question, telling me some things dolphins eat. In the process she went from an unmotivated searcher to a motivated searcher: she got intrigued by what a cephalopod is (it is mentioned in the one-box) and set off on an unprompted search to find out.

The next task was too much for her. She’s British, so I decided to go with prime minister, as I didn’t think she’d know what or who the vice president was. It turns out she wasn’t entirely clear on what a prime minister is either, searching for [primeinister]. She composed a search that could have worked as a stand-alone query, and Google corrected it to [when is the prime minister’s birthday next year]. In fact, Google couldn’t answer this directly, and since it wasn’t quite the actual answer to the question as asked, she got stuck at this point, unable to structure the query quite how she wanted it.

Actually, she probably went slightly too far in the first jump. She probably should have gone with something like [when is the prime minister’s birthday] and followed with [what day is <date> next year] but she didn’t make that logical leap unprompted.

Even though my son was a little young, we thought it’d be fun to see how he fared on the “dolphin” question. The date one was a little too much of a stretch:

Adam aged 5

Interestingly, he spelled “dolfin” the same way as his sister (this must be our failing as parents!) but also went with the phonetic “wat” instead of “what”. Nonetheless, Google was quite happy interpreting his search as [what do dolphins eat] so he got the same one-box as his sister.

Just like her, he skipped everything else on the page to go straight to the one-box. This is probably not that surprising in either of their cases – it’s most likely what adults do, and it’s clearly designed to draw attention with the bright image high up on the page.

What was interesting and different was that he didn’t read the whole thing. At the time of the experiment, he was obviously a less confident reader, and preferred to read aloud rather than in his head. He didn’t scan the one-box for the answer and report it, but interestingly, nor did he read the one-box aloud. Instead, he read only the words in bold.

This isn’t the most obviously crazy strategy (at least in the mind of a 5 year old): it isn’t crazy to think that Google would have bolded the words that are the answers to the question you asked, though search professionals know that’s not what’s really going on here. It started okay but then went a little bit off the rails. Here’s what he read out as the answer to [what do dolphins eat?]:

Killer whales

He got a bit confused at “killer whales” and knew he was off-track, but wasn’t sure what had gone wrong.

I think the lesson here is that even though people may primarily use the obvious tools and affordances presented to them, they will also make potentially incorrect assumptions and risk being led astray by well-intentioned sign-posts in the UI.

Some other kids’ misconceptions

One child apparently thought that the autosuggest was a list of answers to the query he was typing. That doesn’t always work perfectly:

But to be fair, it’s not immediately obvious that a UX element like “people also ask” (which does come with embedded answers where possible) is entirely different from related searches, which are not necessarily even sensible suggested questions.

And finally, to end on a light-hearted anecdote from the research, probably my favourite story was the child (not mine!) who looked for both dolphins and information about the Vice President of the United States on the SpongeBob SquarePants website.

Presumably unsuccessfully, at least in the case of the VP’s birthday.

If you liked this post, check out the whole session from my recent SearchLove talk in San Diego (all you need to do is create a Distilled account to access it for free). You can also check out the slides from my presentation below. Enjoy!


SearchLove San Diego 2018 | Will Critchlow | From the Horse’s Mouth: What We Can Learn from Google’s Own Words from Distilled

How To Create Content That Really Converts

Posted by on Jul 26, 2018 in SEO Articles | Comments Off on How To Create Content That Really Converts


“Content” is a catch-all term when it comes to internet marketing, covering anything that isn’t explicit advertising.

I use the word explicit deliberately. Content is, of course, designed to advertise a brand, but in a more subtle and accessible way. Indeed, this form of advertising is known to some as passive marketing, as opposed to active marketing, such as traditional advertising.

I prefer a different definition. Traditional advertising, whether it be commercials, pop-up ads or sponsored links, interrupts what you’re doing, demands your attention, and gets in the way. In that way, this can be thought of as intrusive marketing. As a business, you are going to the consumer. And they hate it.

By contrast, content marketing lets people find your content when they are looking for answers to their problems online. For that reason, we can think of content marketing as a form of organic marketing. The consumer comes to you.

Are you pushing messages to your target audience or attracting them?

The primary way this works is through the other intention of content marketing: content marketing is designed to help. That help might come through advice, recommendations, reviews, how-tos, expert knowledge or more eclectic formats such as panel discussions; the list goes on and on. But every piece of content used for content marketing is designed to solve a problem.

When people look for the solution, they find the content marketing, which introduces them to the business, and when done well, encourages them to buy from that company.

Three Intentions of Content

Awareness – of the brand, the resident expertise, the product and so forth
Assistance – with problems people commonly face within the world of the product or service
Conversion – encouraging those people to see the upside of the offer, and buy

The first two parts are easy – awareness will happen naturally as a result of assistance. But conversion is a different animal altogether, and requires a careful balancing act.

After all, give too much away and there is no need to buy. Explain too little and you only frustrate your reader. Be too promotional and they feel tricked into reading an ad, and will punish you for it.

Strike a balance between how much content you give away and how promotional you are. It’s difficult, but you can do it.

You need to strike the balance, and that means any promotional aspect to the content must be storified and subconscious. Fortunately, that’s exactly what we’re going to talk about today.

Here are my five golden rules to creating content that really converts.

1. Feel Pain and Build Trust

We spend our lives trying to avoid pain. It’s the reason consumerism has taken over the globe. We see products as solutions to our problems, and ultimately, all our problems give us pain. It’s why we refer to consumer challenges as ‘pain points’.

You might think that pain isn’t something you should be inflicting if you want to sell products, and you’re right. Narrowly, but right. You shouldn’t be vindictive, or upsetting, or controversial in an attempt to hurt people. That will always fail.

But empathy, empathy is the foundation of sales. And by telling the story of pain, by evoking it instead of inflicting it, you can create a sense of kinship with your reader.

Let them know you care about them.

Pain is real, and it is human. No machine feels pain. Anything that makes your content look like it was written by a real, genuine human is good. So what pain does your product or service alleviate? That’s where you start. That’s your strategy. Evoke that pain. Tell the story of that pain.

When people see that you understand personally, viscerally, what that pain, frustration, torment or insecurity feels like? They will start to trust you. And once you start to develop trust, you can start to convert sales.

2. Gain Respect By Watching Your Neighbors

Always keep your eyes on the market changes

In order for people to trust you they have to respect you, and to respect you, they need to feel like you’re an industry leader. Thought leadership is an increasingly common trend, and that’s all about getting out in front of competitors and defining a compelling, hopeful vision for the future of the business you’re in.

So, if you want to be both trusted and respected, and you’ll need to be if people are going to buy, then you need to be on top of your competitors at all times. You need your finger on the pulse of your industry and you need to be current. Watch out for market changes.

#AmazonGo opens on Monday, January 22 in Seattle. Get the app to enter the store. See you soon!

— (@amazon) January 21, 2018

You do not want to be the last to know if your competitor is opening a new branch a few blocks away from you, right?

That isn’t just about branch opening, share prices or product releases either. You need to be attuned to their communication. It’s a sad fact that every writer feels they can see the problems in another’s work, but for you that can be a blessing and a guide. Seeing what they’re doing, whether it’s good or bad, can give you something to emulate or something to develop. You can do what they do, or differentiate yourself depending on how successful it is.

I’m talking about their offer. I’m talking about their framing. Their call to action. Their landing page design. Their pitch. The way the content flows. It’s all there to be understood, you just have to analyse it, and with enough regularity that you don’t fall behind. This is a rapidly evolving discipline.

Follow the trends quickly and aggressively until you’re caught up on them, then take the understanding that’s given you and forge ahead. Lead the field. Consumers will follow the leader.

3. Tell A Compelling Story With Catharsis

This is everything, really. Anything that’s ever been successful succeeded because it told a story people wanted to hear. Every successful brand tells a good story, so make yours one of them.

4 Key elements to build a thoughtful, unique and emotional brand story:

Find your common ground – Know your key consumer insights and where to connect with them.
Know your origin – Why you started the business and the main issue you are trying to solve.
Keep it positive – Be consistent and positive throughout the journey.
Stay on brand – Make sure every visible expression of your marketing efforts and outreach is aligned with your brand.

Storytelling allows you to make your promotion indirect and subconscious. It creates a distance that lets people invest more readily. Do you remember what I said about pain, and how our lives are spent avoiding it? That's why we don't want direct advertising, but watching someone else struggle is the basis of every movie or TV show we've ever watched.


The Greeks invented theatre as a way to separate people from their emotions, so they could watch their pain simulated at a safe distance and experience the release, or catharsis, that comes when the suffering ends. This "feeling box" has evolved, but it's most concisely captured in the film Inception, where Cobb explains that positive emotion is the most powerful way to implant an idea. Reconciliation with an estranged parent is a powerful motivator.

Of course, your motivator may not be THAT powerful, but you’re not in a conceptual sci fi blockbuster either – you’re trying to sell a product.

So put a cipher for your audience at the centre of the story – this can be a previous client, an apocryphal person, or the writer themselves. Explain how they feel, what they want, what they struggle with, and how the product came along and lifted that curse, provided the release from pain, the catharsis. Build up the emotions then release them.

One of the most memorable examples began with an extraordinary proposition for an advert: transform hate into love. Take all the worst parts of something and change them, and the world, for the better. Reframe hate as a seed from which love can grow. It was, of course, Honda.

Hate is one of the most powerful and destructive and upsetting emotions to evoke, and this ad makes it light and airy and constructive and positive. That’s a journey, and that’s what story is. Change.

4. Use Emotional Intelligence To Convert

Consumers make their decisions based on emotion, then justify them with logic afterwards. All the cool facts in the world won't matter if your audience doesn't feel anything, because the emotional circuits that drive decisions are largely separate from the rational, verbal parts of the brain. It's why we have to feel things.

So what are you waiting for? Start to evoke their emotion now!

My favourite feeling is frisson. A sudden rush of excitement, which also comes with a sense of recognition. How many times have you thought or heard, “I don’t know what I want, but I’ll know when I see it?” That.

By evoking pain through storytelling, by offering a unique perspective on that struggle through your thought leadership, and by providing a solution that is genuinely helpful, nurturing and altruistic, you become the modern equivalent of a spirit guide. The only difference is the language you use.


Contrasts create surprise. These juxtapositions are the essence of a joke: it leads you down a path of expectation and flips it at the end.

Be surprising, and you can shake people out of apathy and get them paying attention. There’s nothing worse than having your expectations fulfilled with no imagination.

There's a totally fake quote out there claiming Henry Ford said that if he'd given people what they WANTED, he'd have given them faster horses. It still circulates because it's such a wonderful image. But you really can't just ask people what they want and give it to them. There has to be more.

Instead, Ford invented the car. That’s the level of “wow” you should aspire to when revealing the twist in your tale. Your solution should be so beyond the initial crisis that it solves problems people didn’t even know they had. Like how buying a GoPro makes you an elite adventurer by selling you a lifestyle.

Surprise, recognition, frisson.

5. Use Testimonial Or Case Studies

Testimonials, comments and reviews matter!

Reality TV isn’t reality. Documentaries are edited for story. But we LOVE them, and we love them because we get to believe they’re real. As the X-Files told us, we want to believe.

The case study, and even better, the testimonial, are the ‘documentary’ of content marketing. They can be more powerful and more compelling than more general content because they’re written about or by people who already fit the consumer profile for the product.

What would your clients say about you?

What’s more, they storify their struggles and your solutions, making your arguments for you.

The authenticity is immediate and undeniable. Third parties have no reason to shill your product, so they must be responding out of genuine gratitude with a genuine recommendation.

But how do you get the most from them?

When you reach out to get testimonials and reviews, you need to provide prompts that will get your writers “on the rails” – give them a short feedback form. Ask:

Did they like it? What did they like the most?
How do they evaluate your service? Have they used competitors?
Would they recommend the product or service?

Then you can use pull-quotes, like movie posters, from the people who submit feedback.

If you need more control than that, or you work with larger clients who don't have time to rub your belly in public, you can use case studies of your work on behalf of clients. Storifying the process and fitting the hard facts into a classic structure will help you spin thrilling tales of your derring-do to your audiences, without it ever feeling like you're just showing off.

These forms of storytelling can vastly increase a prospect's confidence in your product or service. After all, "people buy from people" is the oldest maxim in the marketing handbook.

The Go-Home

Remember, the call to action needs to fit the tone of the piece. You can’t write a beautiful and affecting and genuinely helpful piece of content then put a flashing BuyNow.Gif at the end. It won’t work and it’ll sour the whole experience.

Be helpful, be valuable. Be expert. You will gain trust and respect, which will make converting to sales gentler and easier.

And remember, this is a process. You need to constantly evaluate your content, using A/B testing, Google Analytics and other tools to track how successful your different ideas and approaches are, and make improvements based on data. All of this is just advice, and you still have to find the right way to execute it. I wish you luck.

Have a success story? Share it in the comments section below!



Keyword cannibalization

Posted by on Jul 26, 2018 in SEO Articles | Comments Off on Keyword cannibalization


Keyword cannibalization means that you have various blog posts or articles on your site that can rank for the same search query in Google. If you optimize posts or articles for similar search queries, they eat away at each other's chances to rank. Here, I'll explain why keyword cannibalism is bad for your SEO, how you can recognize it, and how to solve it.


What is keyword cannibalization?

If you optimize your articles for similar terms, you might suffer from keyword cannibalization: you’ll be devouring your own chances to rank in Google. Google will only show 1 or 2 results from the same domain in the search results for any specific query. If you’re a high authority domain, you might get away with 3.

Why is keyword cannibalism bad for SEO?

If you cannibalize your own keywords, you’re competing with yourself for ranking in Google. Let’s say you have two posts on the same topic. In that case, Google isn’t able to distinguish which article should rank highest for a certain query. As a result, they’ll probably both rank lower. Therefore our SEO analysis will give a red bullet whenever you optimize a post for a focus keyword you’ve used before.

But keyword cannibalism can also occur if you optimize posts for focus keywords that are not exactly, but almost, the same. For instance, I wrote two posts about whether or not readability is a ranking factor. The first post is optimized for 'does readability rank', while the second post is optimized for the focus keyword 'readability ranking factor'. The posts have a slightly different angle but are still very similar. For Google, it is hard to figure out which of the two articles is most important. As a result, you could end up ranking low with both articles.

How to recognize keyword cannibalization?

Checking whether or not your site suffers from keyword cannibalism is rather easy. Search your site for any specific keyword you suspect might have multiple results. In my case, I'll google readability ranks. The first two results are the articles I suspected of suffering from cannibalization.

Googling site:yourdomain.com "keyword" will give you an easy answer to the question of whether you're suffering from keyword cannibalism.
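If you keep track of each post's focus keyword, this check can also be scripted. Here is a minimal sketch; the URLs and keywords below are hypothetical examples, not real posts:

```python
from collections import defaultdict

def find_cannibalization(posts):
    """Group post URLs by focus keyword; any keyword with more than
    one URL is a cannibalization candidate worth reviewing."""
    by_keyword = defaultdict(list)
    for url, keyword in posts:
        by_keyword[keyword.lower().strip()].append(url)
    return {kw: urls for kw, urls in by_keyword.items() if len(urls) > 1}

# Hypothetical posts: (url, focus keyword)
posts = [
    ("/does-readability-rank/", "readability ranking factor"),
    ("/readability-ranking-factor/", "Readability ranking factor"),
    ("/keyword-research-guide/", "keyword research"),
]
print(find_cannibalization(posts))
# flags the two readability posts as competing for the same keyword
```

It won't catch near-duplicate keywords with different wording, but it's a quick first pass on a large site.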

Solve keyword cannibalization with internal linking

You can help Google figure out which article is most important by setting up a decent internal linking structure. Link from posts that are less important to the posts that are most important to you. That way, Google can figure out (by following links) which ones you want to rank highest in the search engines.

Your internal linking structure could solve a part of your keyword cannibalism problems. You should think about which article is most important to you and link from the less important long tail articles, to your most important article. Read more about how to do this in my article about ranking with cornerstone content.
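The principle can be sketched with a simple inlink count. This is a toy model (real link analysis weighs far more signals), and the page URLs are hypothetical:

```python
from collections import Counter

def inlink_counts(links):
    """Count internal links pointing at each page. In this toy model,
    the page with the most inlinks is the one you're signalling as
    most important to search engines."""
    return Counter(target for _, target in links)

# Hypothetical internal links: (from_page, to_page)
links = [
    ("/does-readability-rank/", "/readability-ranking-factor/"),
    ("/seo-basics/", "/readability-ranking-factor/"),
    ("/readability-ranking-factor/", "/seo-basics/"),
]
counts = inlink_counts(links)
print(counts.most_common(1))  # the cornerstone article collects the most inlinks
```

Running this over your own site's link graph quickly shows whether your long tail articles actually point at your cornerstone content.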

Solve keyword cannibalism by combining articles

In many cases, the best way to solve the keyword cannibalization problem is by combining articles. Find the articles that focus on similar search queries. If two articles are both attracting the same audience and are basically telling the same story, you should combine them. Rewrite the two posts into one amazing, kickass article. That'll really help your ranking (Google loves lengthy and well-written content) and solve your keyword cannibalization problem. That's exactly what I should do with my two posts about whether or not readability is a ranking factor. In the end, you'll delete one of the two articles and adapt the other. And don't forget: don't just press the delete button; always make sure to redirect the post you delete.
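The redirect itself is a one-liner on most servers. A hedged example for Apache (the paths are hypothetical; adjust them to your own URLs, and use your server's equivalent if you're not on Apache):

```apache
# .htaccess: permanently redirect the deleted post to the combined article
Redirect 301 /does-readability-rank/ https://example.com/readability-ranking-factor/
```

A 301 tells Google the move is permanent, so the old post's link value is passed on to the surviving article.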

Keyword cannibalism will affect growing websites

As your site grows, your chances of facing keyword cannibalism on your own website increase. You'll be writing about your favorite subjects and, without even knowing it, you'll write articles that end up rather similar. That's what happened to me too. Once in a while, you should check the keywords you want to rank for most and make sure you're not suffering from keyword cannibalism. You'll probably need to make some changes to your site structure or rewrite some articles every now and then.

Read more: Keyword research: the ultimate guide »

The post Keyword cannibalization appeared first on Yoast.

Thin Content – What It Is and How To Fix It

Posted by on Jul 26, 2018 in SEO Articles | Comments Off on Thin Content – What It Is and How To Fix It


You've optimized your content for SEO. Now you're definitely going to rank for your keywords, right? But it didn't happen. You can't seem to find your content anywhere.

It happened to me and that sucked.

You’re probably wondering why this happened after you’ve put in so much effort to optimize your content with SEO. Is it a content issue?

The unfortunate answer is – yes, it is a content issue. Quite a while back, Google released an update named Google Panda, and this adorable black and white creature is the one who is penalizing you due to your content quality.

Google does have a very good reason for penalizing websites with low-quality or “thin” content. Prior to the update, short and spammy pages were a common sight. This defeats Google’s purpose since providing valuable information to users is the goal.

As a content creator, you would want to provide genuinely useful content to users, but there could be issues holding you back that you were not aware of. To rank well and help users see you and get to know you, you would need to avoid thin content.

What is Thin Content?

Thin content is what Google considers to be of little value. You've probably encountered some when you were looking for answers and landed on pages that did not answer your question. Thin content is quite easy to identify:

Not original or unique content
A low word count
Comes from external sources
The topic is only covered in a shallow manner
Doesn’t seem to serve any purpose
Has plenty of spelling or grammatical mistakes
Is generally not well written
Packed with keywords
Is not very informative

While it is not always the intention of the writers to produce thin content, there’s just no way for a search engine to understand that. The only thing we can do is to take necessary steps to avoid it.
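Some of these signals can even be checked mechanically before you publish. A rough sketch; the thresholds below are illustrative assumptions of mine, not Google's actual criteria:

```python
import re

def thin_content_signals(text, focus_keyword, min_words=300, max_density=0.03):
    """Flag two mechanical thin-content signals: very low word count
    and keyword stuffing. Thresholds are illustrative, not Google's."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    word_count = len(words)
    keyword_hits = text.lower().count(focus_keyword.lower())
    density = keyword_hits / word_count if word_count else 0.0
    return {
        "word_count": word_count,
        "too_short": word_count < min_words,
        "keyword_density": round(density, 3),
        "keyword_stuffed": density > max_density,
    }

sample = "Buy cheap shoes. Cheap shoes here. Best cheap shoes online."
print(thin_content_signals(sample, "cheap shoes"))
```

A script like this can't judge originality or usefulness, but it catches the crudest problems (tiny pages, stuffed keywords) cheaply across a whole site.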

Is Duplicate Content The Same As Thin Content?

There are several types of duplicate content. The first is the same page living at multiple URLs, on one website or even across several websites, so that different addresses always bring you back to the same content. Since Google views every URL as unique, every copy it finds makes your content look thinner.

The second is a complete copy: identical content pasted across different pages or websites. Every copy that exists will be penalized, and Panda will chew you up like bamboo stalks.

The third is a partial copy of the content across multiple pages or websites. Even if it isn't completely identical, if the copied parts are the core message, you will definitely get swiped by the Panda.

And then we have pages that are jam-packed with ads, which results in less unique content. I'm talking about pages that have ads screaming at you from every conceivable angle. If you're looking to run the maximum number of ads while still getting your page to rank, I have bad news for you.

There has been news of an unconfirmed Google ranking update called Fred, which targets websites with low-value content that prioritize earning revenue. Sites supposedly hit by Fred reported a strong decline in traffic from Google organic search; we're talking about a 50% to 90% drop. Besides obvious ads, other types of revenue generation like affiliate models and lead generation are also said to be affected. So don't think that Panda is the only one out to get you. If you want your pages to be considered high quality, keep a healthy ratio of unique content to ads. Above all else, your content must be aimed at helping users, because that's what it is supposed to be for.

So, thin content and duplicate content are not the same. However, they’re both content issues that would lower the quality of your pages and ultimately cause you to be taken less seriously by Google. And that’s definitely something we don’t want.
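Detecting complete or partial copies across your own pages can be sketched by hashing normalized paragraphs. This is a simplified illustration (not how Google does it), and the pages below are hypothetical:

```python
import hashlib
from collections import defaultdict

def duplicate_paragraphs(pages):
    """Map each normalized paragraph hash to the pages it appears on;
    hashes seen on more than one page indicate copied passages."""
    seen = defaultdict(set)
    for url, text in pages.items():
        for para in text.split("\n\n"):
            normalized = " ".join(para.lower().split())
            if normalized:
                digest = hashlib.sha1(normalized.encode()).hexdigest()
                seen[digest].add(url)
    return {h: urls for h, urls in seen.items() if len(urls) > 1}

pages = {
    "/page-a/": "Unique intro.\n\nOur shared boilerplate paragraph.",
    "/page-b/": "Different intro.\n\nOur shared boilerplate paragraph.",
}
dupes = duplicate_paragraphs(pages)
print(len(dupes))  # 1 shared paragraph found
```

Exact hashing only catches verbatim copies; fuzzier matching (shingling, similarity scores) is needed for lightly rewritten text.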

It’s Sapping Your SEO Health

Search engine bots are entirely different entities. They don’t function like us and therefore will not understand and categorize websites the same way. They figure out what your website is all about by crawling the content.

The best way to establish authority and credibility on your chosen topic or industry is to have your SEO content rank high for it. The more your content focuses on your niche, the easier it is for the crawlers to index and categorize and decide how relevant and serious you are for it.

The opposite is also true. Thin content is a content issue and would render your SEO efforts less than effective. It also slows down the user journey and makes it harder for them to find what they are looking for.

How To Fix Thin Content

So you may have found some content on your site that would qualify as thin or duplicated content. But relax, it doesn’t mean that you have to be ready to destroy them on sight. While deletion is an acceptable method, there are also different ways to deal with thin content, such as:

1) Expand

Imagine your content as a scrumptious dish you are trying to cook that will appeal not only to users but also to Google. If your dish is lackluster or dull, you can improvise and expand it with more ingredients that complement the dish and make it better. For example, if your product page is thin, you can add unique content in the form of a detailed explanation. Your readers would be able to find information on your page without having to leave it. This is useful to your readers and helps make your site look good in Google's eyes. It would probably increase your chances of making a sale, too.

2) Rewrite

Sometimes your recipe may already have all the important ingredients, but the problem is how the final dish is presented. So take all the information you already have and present it in a manner that is more easily understood and more purposeful. Essentially, you rewrite it. Make sure it's well written, with relevant information, and not just packed with keywords.

3) Replace

If neither of those works for you, then perhaps there's a problem with the recipe itself. Why not replace it with another one that could work better? For your content, this means scrapping the whole thing and writing new content in its place. After all, your dish must appeal to readers for you to rank high.

4) Remove

If all else fails, you can always consign your dish to the bin. Rather than letting content issues bog down your ranking, it's better to remove the content entirely. However, do it with a lot of care. Removing huge sections of your site is the equivalent of amputating body parts and may cause negative SEO effects instead.

Tools At Your Disposal

There are some tools that can help identify and prevent duplicated and thin content.

1) Grammarly

This is AI-powered and is used commonly to help detect writing mistakes. Simply enter your content and it will function as a proofreader and a plagiarism checker. The proofreader would make your content that much more reader-friendly and the plagiarism checker can crosscheck your content against billions of web pages and detect if there are plagiarized passages. Fix the plagiarized passages and your content issues will become less of a target to Panda.

2) Copyscape

It's nifty for pages that are already published. Just enter your URL and Copyscape will tell you if anyone has copies of your web pages online, or if you've plagiarised someone else's content. The free option is limited to these checks, but it's a convenient tool to have at your disposal.

3) SEOPressor

Not to toot our own horn, but it's a great tool to have during your writing process. So if your aim is to expand, rewrite, or replace your thin content, you can give this a try. For Panda avoidance, SEOPressor helps you keep your content length, keyword-to-content ratio, and readability ideal. I'm a fan of preventing issues, and with this tool you address problems before they become one.

4) Content SEO Checker

It’s essentially a free version of SEOPressor albeit with fewer functions. It’s more than sufficient if you want to compose your content from scratch and head to a mostly Panda-free zone.

Keeping Your Website Safe from Panda

The goal is to give Google the impression that your content is rich and unique. There isn't a tool that can produce unique content for you, although you can find a duplicate content checker quite easily. We really only need time, experience, and research to identify thin and duplicate content and make them better. It is also in your best interest to understand that Google wants to present the best and most useful answers to users.

To combat possible future updates, you can also ensure your site is engaging and of high quality. Creating great content may even help you to rank better!



How High Page Load Time Is Affecting Your Conversion

Posted by on Jul 26, 2018 in SEO Articles | Comments Off on How High Page Load Time Is Affecting Your Conversion


Let's say you need to check out something real quick. You type your query into the browser on your laptop or PC. Google loads your search results page and you pick the top-ranking results to get your answer. Then you wait for the load time… 1, 2, 3, 4, 5 seconds pass, but the page continues to load at its own pace. What do you do?

The likely scenario is, you would leave and try another page.

We live in the era of instant gratification. People want things fast and now. By the way, when I say website page speed, I mean how long your web page needs to complete its rendering. If your web page load time is too slow, then you’ve probably already lost a lot of users who have gone on to someone faster.

Can A Few Seconds Affect Bounce Rates?

According to Think With Google, those 1 or 2 seconds you're missing out on are pretty important. By the time your loading time hits 6 seconds, the probability of a bounce has increased by more than 100%.

Image Source: Think With Google

Coming back to your consumers, Forrester Consulting's survey found that people expect your page to load within 2 seconds, and 40% of them won't wait longer than 3 seconds before abandoning a website.

Online shoppers are a fickle lot, but more than half of them say their loyalty depends on short load times. This is especially true for high-spending shoppers. If your website page speed is low, your shoppers become distracted: 14% of them will stray to other sites and shop there instead, and 23% will stop shopping and walk away from their computer altogether.

Retail and travel sites are some of the most affected when their website page speed is low. Almost 8 out of 10 would not buy again if they felt dissatisfied with this aspect. What’s worse is, 64% of those would bring their business elsewhere and buy from someone else.

So if you have not done any web page optimization, you’re definitely losing leads and conversion. Even if you’re ranking high in Google search at the moment, your content is failing to bring you sales because your consumers have left your website even before your content has loaded.

How Slow Is A Slow Load Time?

Google's page speed penalization goes back as far as 2009, so it's been around for a long time. But mobile usage has since outstripped desktop usage and is projected to keep doing so. Google has taken this trend into account and has announced that it will start factoring mobile page speed into its ranking algorithm from July 2018.

So what does this mean for you? Being in business means you have to move with the times. If you’re not advertising or optimized for mobile devices, then you’re losing out on many prospects.

Then your next question would be: "what is considered slow?" That depends on your niche and how fast you are compared to your competitors. Google itself has experimented with page speed and discovered a steep 20% traffic drop when the results page loaded just half a second slower. While having relevant content helps you rank, it's a shame to be ranked lower over fractions of a second.

Your Load Time is Affecting Your Adwords

What?! Yup, it is. Google claims that the update will only affect pages that deliver the slowest experience to users and will only affect a small percentage of queries. But your Quality Score will be affected because your Landing Page Experience will be impacted by low website page speed.

The reasoning is simple. If Google brought prospects to your website but your information doesn’t load quickly enough (assuming that your content is relevant and useful), you’re going to end up paying more or failing to show an ad altogether.

Relevant and compelling content is important but optimizing your page load time is equally vital to help your users get where they’re going faster and more efficiently.

What’s Lowering Your Website Page Speed?

There is a long list of things that can make your page's load time long. Some of the possibilities are:

Huge Images – Some images are too heavy to load and are pulling your website page speed down
Cheap Host – Pay peanuts, get monkeys. Cheaper doesn't always mean better; go for a host that suits your business size
External Media – Media like videos can really add value to your content but can also hurt your load time. You can try hosting them on your own server to speed things up a bit
Overwhelming Ads – There's such a thing as too much. Too many ads will annoy your users, make your content look thin, and slow down your page
Theme Design – Some themes are pretty, with all the effects, but they can also slow your page speed down

Tools to Test Your Page Speed

If you need a website load test tool, you're in luck. There are quite a lot of tools on the Internet that you can use to measure your website page speed. Some of the best (easy and free to use) are:

1) PageSpeed Insights

Google has developed their own tool that analyzes the content of a web page, then generates suggestions to help you make your page faster. It’ll be divided conveniently into desktop and mobile to help you identify which end needs more work. You simply need to enter your web page URL.

2) WebPagetest

An open source project backed by Google in its efforts to make the web faster, it's hosted by companies and individuals around the globe. Enter your URL, choose a test location and a browser, and it will run 3 tests and give you the performance results.

3) Mobile Site Speed Test

Yet another Google-related speed test tool (they're taking this seriously; you should too), this one is for mobile speed. After you enter a URL, it will test your page and show you how quickly it loads, the estimated percentage of visitors lost, how your mobile site speed compares to others in the same industry, and how much loading time a few fixes could shave off.

Speed Optimization Techniques You Can Try

After testing your website with the tools above, you should receive some suggestions on how to improve your website page speed. Here are some website optimization tips you can try to trim off a few more seconds of load time:

Optimize your images

You can reduce the size of your images if you use the “Save for Web” option offered by programs like Photoshop. If you don’t have that program, you can opt for free ones like Web Resizer or Pic Resize. Try not to use HTML to resize images (I’m talking about WordPress blogs). The picture may look smaller, but the process still calls for the whole image to be loaded, and then resized to what you want. So, there won’t be a reduction in load time.

Cache your pages

Rather than generating the page every time someone visits, consider using the caching plugins available for content management systems like WordPress. They cache your pages and serve the stored copy to your users.
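The idea behind page caching can be sketched in a few lines. This is a simplified in-memory cache with a time-to-live; real caching plugins persist to disk and handle invalidation, and the render function here is a made-up stand-in:

```python
import time

class PageCache:
    """Serve a rendered page from memory until its TTL expires,
    instead of regenerating it on every request."""
    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self.store = {}  # url -> (rendered_html, timestamp)

    def get(self, url, render):
        entry = self.store.get(url)
        if entry and time.time() - entry[1] < self.ttl:
            return entry[0]                 # cache hit: skip rendering
        html = render(url)                  # cache miss: render once
        self.store[url] = (html, time.time())
        return html

calls = []
def slow_render(url):
    calls.append(url)                       # track how often we actually render
    return f"<html>{url}</html>"

cache = PageCache()
cache.get("/home", slow_render)
cache.get("/home", slow_render)             # second request served from cache
print(len(calls))  # 1
```

The expensive render runs once; every later request within the TTL is a dictionary lookup, which is the entire speedup.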

Code Minification

Minification means removing all unnecessary characters from source code without changing its functionality. That means fewer bytes to load, which translates to faster load times.
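A toy CSS minifier shows the principle; real minifiers like cssnano or UglifyJS do far more, and the sample stylesheet below is hypothetical:

```python
import re

def minify_css(css):
    """Remove comments and collapse unnecessary whitespace.
    A toy illustration of minification, not a production minifier."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # strip comments
    css = re.sub(r"\s+", " ", css)                    # collapse whitespace
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)      # tighten punctuation
    return css.strip()

source = """
/* header styles */
.top-row h2 {
    color: #de5840;
    font-weight: 600;
}
"""
print(minify_css(source))  # .top-row h2{color:#de5840;font-weight:600;}
```

Every stripped byte is a byte the browser never has to download, which is why build pipelines minify CSS and JavaScript automatically.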

Quick Loading Above-The-Fold Content

For those of you who are wondering, Above-The-Fold (ATF) Content is the portion of your page that you can see without scrolling. Most people may not even scroll to view the rest of your content, so put more effort into making sure at least this portion loads fairly quickly.

Getting rid of render-blocking Javascript and CSS

You might see this when you try the tools above. What the tools are telling you is that you have JavaScript and CSS that are blocking rendering and slowing down your page load unnecessarily. If the scripts are not needed to render the ATF content, defer them or load them asynchronously so they stop holding up the first paint.
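A hedged markup illustration of the idea, assuming a script (here a hypothetical analytics.js) that isn't needed for the first paint: rather than deleting it, move it out of the critical rendering path with the defer attribute.

```html
<!-- Render-blocking: the browser stops parsing until this downloads and runs -->
<script src="analytics.js"></script>

<!-- Deferred: downloads in parallel and runs after the document is parsed -->
<script src="analytics.js" defer></script>
```

The async attribute is a similar option for scripts that don't depend on document order.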

Compress your pages

Smaller files load faster. Enable GZIP compression and you can shrink your HTML, CSS, and JavaScript files by as much as 70% with no loss of quality. Nearly all modern browsers support it, but you can always test it out before rolling it out.
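
You can see the effect with nothing but Python’s standard library. The HTML snippet below is made up, but its repetitive markup is typical of real pages, which is exactly why GZIP works so well on them:

```python
import gzip

# An invented but typically repetitive chunk of HTML standing in for a page.
html = ("<div class='product'><h2>Item</h2><p>Great value!</p></div>" * 200).encode()

compressed = gzip.compress(html)
ratio = 1 - len(compressed) / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.0%} smaller)")
```

In practice you don’t compress responses by hand; you flip a switch in your web server config (e.g. nginx’s `gzip on;` or Apache’s `mod_deflate`) and the server does this for every response.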


While website page speed isn’t going to be the reason you get bumped up to the first page, it can become a detracting factor that lowers your ranking and definitely drags down your conversion rates. It affects how users experience your website. If you want to inspire confidence and build trust, you shouldn’t keep your visitors waiting. Give them a good experience and they will tell their friends and family about your page.

Although reducing page load time is definitely going to be challenging, it will surely impact your website positively. Rome wasn’t built in a day, so you don’t have to do everything today. But do make the effort to work through the recommended actions. They’re totally worth it!

With Google’s mobile speed update coming into play in July 2018, mobile responsiveness and speed optimization are the next things to watch out for. It’s always better to be safe than sorry, so make your preparations and ride the waves of change!

#2 – High-Impact vs. Low-Impact SEO Actions

Posted by on Jul 26, 2018 in SEO Articles | Comments Off on #2 – High-Impact vs. Low-Impact SEO Actions

There are hundreds of actions you can take in a single SEO campaign, but not every action is created equally. That’s why in Episode #2 of The SEO Life podcast, I’ll be explaining the difference between high-impact and low-impact SEO actions.

Understanding the difference will not only save you precious time, but will also dramatically improve your SEO results.

Let’s jump in.

6 High-Impact SEO Actions to Focus on

I’ll start with the high-impact actions that every SEO should be focusing on or giving to a team member.

1. Creating Content Assets

The first and most important high-impact SEO action is creating SEO content assets. Content is the lead domino for any effective long-term SEO campaign. Not only can creating content around specific keyword phrases drive traffic to your website, but your content can also be your primary asset for acquiring backlinks as well.

It’s extremely challenging to acquire backlinks without content assets, but they will make your life a hell of a lot easier once you have them.

I truly believe most of the focus of an SEO campaign should be on creating incredible content assets.

2. Performing Content Audits

The second high-impact action in my experience is performing content audits. Most websites don’t create content with SEO in mind and that’s why performing a content audit is critical in most cases.

An effective audit will help you identify pages that are outdated, thin, duplicate, or just outright bad. You can then decide to delete the pages, redirect them, or improve them and that decision is based on the existing data.

For example, if a page sucks, but it has 20 linking root domains, it may be worth the effort to improve it. However, if it isn’t worth improving or it’s no longer relevant, then you can redirect it to a relevant page on your website. This will help you retain the backlink equity.

3. Optimizing Site Architecture

The third high-impact action you can take is optimizing your website’s architecture. In short, an effective website architecture will help Google crawl your website more efficiently, will help users flow through your website effortlessly, and will flow backlink equity through your website. This alone will make your website more authoritative without needing a ton of backlinks.

You should be trying to squeeze as much authority as you can out of every backlink you get, so your website becomes an authoritative powerhouse. Acquiring backlinks is expensive and time-consuming, so it’s wise to get the most out of each one. That’s why developing an effective site architecture is a good use of your time and effort.

4. Optimizing Technical/UX Performance

The fourth high-impact SEO action is to optimize your site’s technical and User Experience (UX) performance. Now this one is tricky because there are high-impact and low-impact actions within it.

I’ll focus on the high-impact actions because I’ll be hitting the low-impact ones in a second.

The high-impact technical issues that you must tackle are site loading speed, mobile friendliness, and crawling/indexing. Medium impact actions include fixing redirect issues, fixing broken links, and handling 404s (that have link equity).

5. Acquiring Backlinks

The fifth high-impact SEO action is trying to acquire backlinks. Backlinks are the necessary fuel for almost every successful SEO campaign. Out of the thousands of keywords I’ve tackled over the last 5 years, I’ve only had a couple that ranked without backlinks. You need quality backlinks in almost every scenario (whether you like it or not).

As I mentioned before, using a content-centric approach is the most scalable link acquisition strategy. But some non-content-dependent link building tactics include landing guest posts, getting media mentions about your company, participating in expert roundups, getting listed in a relevant blogroll, or contributing to HARO requests.

6. Building Systems

The sixth and final high-impact SEO action is to build systems. I focused on this topic in Episode #1 of The SEO Life podcast, so make sure to listen to that one to learn more. But to recap: there are many micro-actions within the larger actions I mentioned above that can be documented, systemized, and outsourced so that you can spend your time on high-level strategy and high-impact actions.

4 Low-Impact SEO Actions to Avoid

Now it’s time for me to cover the low-impact SEO actions that you should outsource, avoid, or not even think about.

1. Rewriting META Descriptions, Messing with ALT Tags, Etc

The first low-impact action is spending your time rewriting meta descriptions, messing with ALT tags, or fixing broken links. ANYONE can do these actions and that should be your basis for what you should be spending your time on. If ANYONE can do what you’re doing with little effort, thought, or guidance, then it’s a low-impact action.

As I mentioned in Episode #1, you need to be careful not to spend your time on Minimum Wage Activities. This isn’t a jab at people who work for minimum wage because I’ve worked for minimum wage several times in my life. But I also learned that if you want to maximize your time, grow your business, and make more money, you need to be extremely careful with what you spend your time on.

Think about how you can be more effective, not more efficient, as Peter Drucker has said.

Don’t forget that you cannot get your time back. That’s why it’s critical that you think about every assignment you get or attempt to give yourself. Can you systemize the assignment and hand it out to someone else? Always ask yourself similar questions before you get into the weeds of a project.

2. Caring About Keyword Density

The second low-impact action is spending your time manipulating or caring about keyword density. Do you know how much time I spend on keyword density for every new keyword-targeted content asset I create…?

…0 seconds because it’s a huge waste of time. Place your target keyword in the title, the URL, the first sentence, and then focus on writing naturally.

You’ll end up mentioning the keyword phrase or variations without even thinking about it. In fact, I create my content FIRST and then add the target keyword in the places I mentioned above.

Focus on creating incredible content, not trying to trick Google’s algorithm into thinking your content is worthy. Just make it worthy and your Google keyword rankings will stick.
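
The placement advice above is easy to turn into a quick self-check. A sketch, where the page fields and example content are hypothetical:

```python
# A quick sanity check for the three keyword placements mentioned above:
# title, URL, and first sentence.
def keyword_placement(page, keyword):
    kw = keyword.lower()
    return {
        "in_title": kw in page["title"].lower(),
        "in_url": kw.replace(" ", "-") in page["url"].lower(),
        "in_first_sentence": kw in page["body"].lower().split(".")[0],
    }

page = {
    "title": "10 Funny Jokes to Brighten Your Day",
    "url": "https://example.com/blog/funny-jokes",
    "body": "These funny jokes are tested on real humans. Enjoy responsibly.",
}
print(keyword_placement(page, "funny jokes"))
# {'in_title': True, 'in_url': True, 'in_first_sentence': True}
```

If all three come back True, stop fiddling and go write something worth ranking.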

3. Getting Non-Editorial Backlinks

The third low-impact action is spending your time getting non-editorial backlinks such as profile links, forum signature links, or any submission-based links. I’ll admit there are some diamonds in the rough on the niche and local level, but 99% of these links are a waste of time and will have little or no positive impact on your results. Plus, even if you want to get these links, it’s a process that can be easily systematized and outsourced.

4. Caring Too Much About Third Party Metrics

The last low-impact action that I can think of is caring too much about third party metrics.

Wow, I really hate this one a lot.

It makes me cringe every time someone asks me “how can I increase my DA?” or they frantically ask “My Trust Flow dropped, what do I do!?!” These are THIRD PARTY METRICS.

The only KPI you should care about when it comes to SEO performance is your organic search traffic.

Listen to me carefully: THIRD PARTY METRICS DON’T MATTER.

You can have a successful SEO campaign without caring about DA, DR, or Trust Flow even for a second. Ahrefs, Majestic, Open Site Explorer, and SEMRush are fundamental tools for an SEO campaign, but their in-house metrics should NOT be guiding your SEO decisions or fueling any insecurities you might have about your campaign.

Focus on organic search traffic inside Google Analytics. If that goes up, you’re doing well. If it goes down, then you need to diagnose it. Simple as that.

That’s It!

I hope you enjoyed Episode #2 of The SEO Life Podcast! If you got value from it, please subscribe and share because I want to help as many people as possible. We’ll talk soon! Thanks for listening.

Rewriting the Beginner’s Guide to SEO, Chapter 2: Crawling, Indexing, and Ranking

Posted by on Jul 25, 2018 in SEO Articles | Comments Off on Rewriting the Beginner’s Guide to SEO, Chapter 2: Crawling, Indexing, and Ranking


Posted by BritneyMuller

It’s been a few months since our last share of our work-in-progress rewrite of the Beginner’s Guide to SEO, but after a brief hiatus, we’re back to share our draft of Chapter Two with you! This wouldn’t have been possible without the help of Kameron Jenkins, who has thoughtfully contributed her great talent for wordsmithing throughout this piece.

This is your resource, the guide that likely kicked off your interest in and knowledge of SEO, and we want to do right by you. You left amazingly helpful commentary on our outline and draft of Chapter One, and we’d be honored if you would take the time to let us know what you think of Chapter Two in the comments below.

Chapter 2: How Search Engines Work – Crawling, Indexing, and Ranking

First, show up.

As we mentioned in Chapter 1, search engines are answer machines. They exist to discover, understand, and organize the internet’s content in order to offer the most relevant results to the questions searchers are asking.

In order to show up in search results, your content needs to first be visible to search engines. It’s arguably the most important piece of the SEO puzzle: If your site can’t be found, there’s no way you’ll ever show up in the SERPs (Search Engine Results Page).

How do search engines work?

Search engines have three primary functions:

Crawl: Scour the Internet for content, looking over the code/content for each URL they find.
Index: Store and organize the content found during the crawling process. Once a page is in the index, it’s in the running to be displayed as a result to relevant queries.
Rank: Provide the pieces of content that will best answer a searcher’s query. Order the search results by the most helpful to a particular query.
What is search engine crawling?

Crawling is the discovery process in which search engines send out a team of robots (known as crawlers or spiders) to find new and updated content. Content can vary — it could be a webpage, an image, a video, a PDF, etc. — but regardless of the format, content is discovered by links.

The bot starts out by fetching a few web pages, and then follows the links on those webpages to find new URLs. By hopping along this path of links, crawlers are able to find new content and add it to their index — a massive database of discovered URLs — to later be retrieved when a searcher is seeking information that the content on that URL is a good match for.
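
That hop-along discovery process is essentially a breadth-first traversal of the link graph. A toy sketch over an invented five-page site:

```python
from collections import deque

# A tiny in-memory "web": each URL maps to the links found on that page.
# The URLs are made up for illustration.
WEB = {
    "site.com/": ["site.com/blog", "site.com/about"],
    "site.com/blog": ["site.com/blog/post-1", "site.com/"],
    "site.com/blog/post-1": ["site.com/about"],
    "site.com/about": [],
    "site.com/orphan": [],  # linked from nowhere -- a crawler never finds it
}

def crawl(seed):
    """Breadth-first link discovery, the way the chapter describes it."""
    index, frontier = set(), deque([seed])
    while frontier:
        url = frontier.popleft()
        if url in index:
            continue
        index.add(url)                      # store the discovered URL
        frontier.extend(WEB.get(url, []))   # follow its links to new URLs
    return index

print(sorted(crawl("site.com/")))
# note 'site.com/orphan' is absent: pages with no inbound links stay invisible
```

The orphan page is the whole lesson of this chapter in miniature: no link path, no discovery, no index, no rankings.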

What is a search engine index?

Search engines process and store information they find in an index, a huge database of all the content they’ve discovered and deem good enough to serve up to searchers.

Search engine ranking

When someone performs a search, search engines scour their index for highly relevant content and then order that content in the hopes of solving the searcher’s query. This ordering of search results by relevance is known as ranking. In general, you can assume that the higher a website is ranked, the more relevant the search engine believes that site is to the query.

It’s possible to block search engine crawlers from part or all of your site, or instruct search engines to avoid storing certain pages in their index. While there can be reasons for doing this, if you want your content found by searchers, you have to first make sure it’s accessible to crawlers and is indexable. Otherwise, it’s as good as invisible.

By the end of this chapter, you’ll have the context you need to work with the search engine, rather than against it!

Note: In SEO, not all search engines are equal

Many beginners wonder about the relative importance of particular search engines. Most people know that Google has the largest market share, but how important is it to optimize for Bing, Yahoo, and others? The truth is that despite the existence of more than 30 major web search engines, the SEO community really only pays attention to Google. Why? The short answer is that Google is where the vast majority of people search the web. If we include Google Images, Google Maps, and YouTube (a Google property), more than 90% of web searches happen on Google — that’s nearly 20 times Bing and Yahoo combined.

Crawling: Can search engines find your site?

As you’ve just learned, making sure your site gets crawled and indexed is a prerequisite for showing up in the SERPs. First things first: you can check to see how many and which pages of your website have been indexed by Google using “site:yourdomain.com”, an advanced search operator.

Head to Google and type “site:yourdomain.com” into the search bar. This will return the results Google has in its index for the site specified:

The number of results Google displays (see “About __ results” above) isn’t exact, but it does give you a solid idea of which pages are indexed on your site and how they are currently showing up in search results.

For more accurate results, monitor and use the Index Coverage report in Google Search Console. You can sign up for a free Google Search Console account if you don’t currently have one. With this tool, you can submit sitemaps for your site and monitor how many submitted pages have actually been added to Google’s index, among other things.

If you’re not showing up anywhere in the search results, there are a few possible reasons why:

Your site is brand new and hasn’t been crawled yet.
Your site isn’t linked to from any external websites.
Your site’s navigation makes it hard for a robot to crawl it effectively.
Your site contains some basic code called crawler directives that is blocking search engines.
Your site has been penalized by Google for spammy tactics.

If your site doesn’t have any other sites linking to it, you still might be able to get it indexed by submitting your XML sitemap in Google Search Console or manually submitting individual URLs to Google. There’s no guarantee they’ll include a submitted URL in their index, but it’s worth a try!

Can search engines see your whole site?

Sometimes a search engine will be able to find parts of your site by crawling, but other pages or sections might be obscured for one reason or another. It’s important to make sure that search engines are able to discover all the content you want indexed, and not just your homepage.

Ask yourself this: Can the bot crawl through your website, and not just to it?

Is your content hidden behind login forms?

If you require users to log in, fill out forms, or answer surveys before accessing certain content, search engines won’t see those protected pages. A crawler is definitely not going to log in.

Are you relying on search forms?

Robots cannot use search forms. Some people believe that if they place a search box on their site, search engines will be able to find everything that their visitors search for, but they won’t.

Is text hidden within non-text content?

Non-text media forms (images, video, GIFs, etc.) should not be used to display text that you wish to be indexed. While search engines are getting better at recognizing images, there’s no guarantee they will be able to read and understand them just yet. It’s always best to add text within the <HTML> markup of your webpage.

Can search engines follow your site navigation?

Just as a crawler needs to discover your site via links from other sites, it needs a path of links on your own site to guide it from page to page. If you’ve got a page you want search engines to find but it isn’t linked to from any other pages, it’s as good as invisible. Many sites make the critical mistake of structuring their navigation in ways that are inaccessible to search engines, hindering their ability to get listed in search results.

Common navigation mistakes that can keep crawlers from seeing all of your site:

Having a mobile navigation that shows different results than your desktop navigation
Any type of navigation where the menu items are not in the HTML, such as JavaScript-enabled navigations. Google has gotten much better at crawling and understanding JavaScript, but it’s still not a perfect process. The more surefire way to ensure something gets found, understood, and indexed by Google is by putting it in the HTML.
Personalization, or showing unique navigation to a specific type of visitor versus others, could appear to be cloaking to a search engine crawler
Forgetting to link to a primary page on your website through your navigation — remember, links are the paths crawlers follow to new pages!

This is why it’s essential that your website has a clear navigation and helpful URL folder structures.

Information architecture

Information architecture is the practice of organizing and labeling content on a website to improve efficiency and findability for users. The best information architecture is intuitive, meaning that users shouldn’t have to think very hard to flow through your website or to find something.

Your site should also have a useful 404 (page not found) page for when a visitor clicks on a dead link or mistypes a URL. The best 404 pages allow users to click back into your site so they don’t bounce off just because they tried to access a nonexistent link.

Tell search engines how to crawl your site

In addition to making sure crawlers can reach your most important pages, it’s also pertinent to note that you’ll have pages on your site you don’t want them to find. These might include things like old URLs that have thin content, duplicate URLs (such as sort-and-filter parameters for e-commerce), special promo code pages, staging or test pages, and so on.

Blocking pages from search engines can also help crawlers prioritize your most important pages and maximize your crawl budget (the average number of pages a search engine bot will crawl on your site).

Crawler directives allow you to control what you want Googlebot to crawl and index using a robots.txt file, meta tag, sitemap.xml file, or Google Search Console.


Robots.txt files are located in the root directory of websites (e.g. yourdomain.com/robots.txt) and suggest which parts of your site search engines should and shouldn’t crawl via specific robots.txt directives. This is a great solution when trying to block search engines from non-private pages on your site.

You wouldn’t want to block private/sensitive pages from being crawled here because the file is easily accessible by users and bots.

Pro tip:
If Googlebot can’t find a robots.txt file for a site (40X HTTP status code), it proceeds to crawl the site.
If Googlebot finds a robots.txt file for a site (20X HTTP status code), it will usually abide by the suggestions and proceed to crawl the site.
If Googlebot finds neither a 20X nor a 40X HTTP status code (ex. a 501 server error) it can’t determine if you have a robots.txt file or not and won’t crawl your site.
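
You can test how a crawler will read your directives with Python’s standard-library robots.txt parser. The rules below are hypothetical:

```python
from urllib import robotparser

# Parse a made-up robots.txt locally (no network needed) and check
# what a well-behaved crawler would be allowed to fetch.
rules = """\
User-agent: *
Disallow: /staging/
Disallow: /promo-codes/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://example.com/blog/post"))      # True
print(rp.can_fetch("*", "https://example.com/staging/draft"))  # False
```

Note the second lesson hiding in this example: anyone (human or bot) can read your robots.txt the same way, which is why it must never be your only protection for sensitive URLs.
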
Meta directives

The two types of meta directives are the meta robots tag (more commonly used) and the x-robots-tag. Each provides crawlers with stronger instructions on how to crawl and index a URL’s content.

The x-robots tag provides more flexibility and functionality if you want to block search engines at scale because you can use regular expressions, block non-HTML files, and apply sitewide noindex tags.

These are the best options for blocking more sensitive*/private URLs from search engines.

*For very sensitive URLs, it is best practice to remove them from or require a secure login to view the pages.
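
For reference, a page-level meta robots tag looks like this (the directive values shown are the standard ones):

```html
<!-- Placed in the page's <head>: keep this URL out of the index,
     but still follow (and pass equity through) its links -->
<meta name="robots" content="noindex, follow">
```

For non-HTML files such as PDFs, the same instruction is sent as an HTTP response header instead: `X-Robots-Tag: noindex`.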

WordPress Tip: In Dashboard > Settings > Reading, make sure the “Search Engine Visibility” box is not checked. Checking it blocks search engines from your site via your robots.txt file!

Avoid these common pitfalls, and you’ll have clean, crawlable content that will allow bots easy access to your pages.

Once you’ve ensured your site has been crawled, the next order of business is to make sure it can be indexed. That’s right — just because your site can be discovered and crawled by a search engine doesn’t necessarily mean that it will be stored in their index. Read on to learn about how indexing works and how you can make sure your site makes it into this all-important database.


A sitemap is just what it sounds like: a list of URLs on your site that crawlers can use to discover and index your content. One of the easiest ways to ensure Google is finding your highest priority pages is to create a file that meets Google’s standards and submit it through Google Search Console. While submitting a sitemap doesn’t replace the need for good site navigation, it can certainly help crawlers follow a path to all of your important pages.
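
A sitemap that meets Google’s standards is plain XML following the sitemaps.org protocol. A minimal example, with placeholder URLs and date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/priority-page/</loc>
    <lastmod>2018-07-25</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/another-page/</loc>
  </url>
</urlset>
```

Most CMSs and SEO plugins will generate and update this file for you; your job is just to submit its URL in Google Search Console.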

Google Search Console

Some sites (most common with e-commerce) make the same content available on multiple different URLs by appending certain parameters to URLs. If you’ve ever shopped online, you’ve likely narrowed down your search via filters. For example, you may search for “shoes” on Amazon, and then refine your search by size, color, and style. Each time you refine, the URL changes slightly. How does Google know which version of the URL to serve to searchers? Google does a pretty good job at figuring out the representative URL on its own, but you can use the URL Parameters feature in Google Search Console to tell Google exactly how you want them to treat your pages.

Indexing: How do search engines understand and remember your site?

In the previous section on crawling, we discussed how search engines discover your web pages. The index is where those discovered pages are stored. After a crawler finds a page, the search engine renders it just like a browser would. In the process of doing so, the search engine analyzes that page’s contents. All of that information is stored in its index.


Can I see how a Googlebot crawler sees my pages?

Yes, the cached version of your page will reflect a snapshot of the last time Googlebot crawled it.

Google crawls and caches web pages at different frequencies. More established, well-known sites that post frequently will be crawled more often than the much-less-famous website for Roger the Mozbot’s side hustle (if only it were real…).

You can view what your cached version of a page looks like by clicking the drop-down arrow next to the URL in the SERP and choosing “Cached”:

You can also view the text-only version of your site to determine if your important content is being crawled and cached effectively.

Are pages ever removed from the index?

Yes, pages can be removed from the index! Some of the main reasons why a URL might be removed include:

The URL is returning a “not found” error (4XX) or server error (5XX) – This could be accidental (the page was moved and a 301 redirect was not set up) or intentional (the page was deleted and 404ed in order to get it removed from the index)
The URL had a noindex meta tag added – This tag can be added by site owners to instruct the search engine to omit the page from its index.
The URL has been manually penalized for violating the search engine’s Webmaster Guidelines and, as a result, was removed from the index.
The URL has been blocked from crawling with the addition of a password required before visitors can access the page.

If you believe that a page on your website that was previously in Google’s index is no longer showing up, you can manually submit the URL to Google by navigating to the “Submit URL” tool in Search Console.

Ranking: How do search engines rank URLs?

How do search engines ensure that when someone types a query into the search bar, they get relevant results in return? That process is known as ranking, or the ordering of search results by most relevant to least relevant to a particular query.

To determine relevance, search engines use algorithms: processes or formulas by which stored information is retrieved and ordered in meaningful ways. These algorithms have gone through many changes over the years in order to improve the quality of search results. Google, for example, makes algorithm adjustments every day — some of these updates are minor quality tweaks, whereas others are core/broad algorithm updates deployed to tackle a specific issue, like Penguin to tackle link spam. Check out our Google Algorithm Change History for a list of both confirmed and unconfirmed Google updates going back to the year 2000.

Why does the algorithm change so often? Is Google just trying to keep us on our toes? While Google doesn’t always reveal specifics as to why they do what they do, we do know that Google’s aim when making algorithm adjustments is to improve overall search quality. That’s why, in response to algorithm update questions, Google will answer with something along the lines of: “We’re making quality updates all the time.” This means that, if your site suffered after an algorithm adjustment, you should compare it against Google’s Quality Guidelines or Search Quality Rater Guidelines, both of which are very telling in terms of what search engines want.

What do search engines want?

Search engines have always wanted the same thing: to provide useful answers to searchers’ questions in the most helpful formats. If that’s true, then why does it appear that SEO is different now than in years past?

Think about it in terms of someone learning a new language.

At first, their understanding of the language is very rudimentary — “See Spot Run.” Over time, their understanding starts to deepen, and they learn semantics — the meaning behind language and the relationship between words and phrases. Eventually, with enough practice, the student knows the language well enough to even understand nuance, and is able to provide answers to even vague or incomplete questions.

When search engines were just beginning to learn our language, it was much easier to game the system by using tricks and tactics that actually go against quality guidelines. Take keyword stuffing, for example. If you wanted to rank for a particular keyword like “funny jokes,” you might add the words “funny jokes” a bunch of times onto your page, and make it bold, in hopes of boosting your ranking for that term:

Welcome to funny jokes! We tell the funniest jokes in the world. Funny jokes are fun and crazy. Your funny joke awaits. Sit back and read funny jokes because funny jokes can make you happy and funnier. Some funny favorite funny jokes.

This tactic made for terrible user experiences, and instead of laughing at funny jokes, people were bombarded by annoying, hard-to-read text. It may have worked in the past, but this is never what search engines wanted.

The role links play in SEO

When we talk about links, we could mean two things. Backlinks or “inbound links” are links from other websites that point to your website, while internal links are links on your own site that point to your other pages (on the same site).

Links have historically played a big role in SEO. Very early on, search engines needed help figuring out which URLs were more trustworthy than others to help them determine how to rank search results. Calculating the number of links pointing to any given site helped them do this.

Backlinks work very similarly to real life WOM (Word-Of-Mouth) referrals. Let’s take a hypothetical coffee shop, Jenny’s Coffee, as an example:

Referrals from others = good sign of authority
Example: Many different people have all told you that Jenny’s Coffee is the best in town
Referrals from yourself = biased, so not a good sign of authority
Example: Jenny claims that Jenny’s Coffee is the best in town
Referrals from irrelevant or low-quality sources = not a good sign of authority and could even get you flagged for spam
Example: Jenny paid to have people who have never visited her coffee shop tell others how good it is.
No referrals = unclear authority
Example: Jenny’s Coffee might be good, but you’ve been unable to find anyone who has an opinion so you can’t be sure.

This is why PageRank was created. PageRank (part of Google’s core algorithm) is a link analysis algorithm named after one of Google’s founders, Larry Page. PageRank estimates the importance of a web page by measuring the quality and quantity of links pointing to it. The assumption is that the more relevant, important, and trustworthy a web page is, the more links it will have earned.

The more natural backlinks you have from high-authority (trusted) websites, the better your odds are to rank higher within search results.
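
The core idea behind PageRank is simple enough to sketch in a few lines. This is a bare-bones power iteration over an invented four-page link graph, using the 0.85 damping factor from the original PageRank paper; Google’s production systems are of course vastly more elaborate:

```python
# Each page lists the pages it links out to (an invented graph).
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Power iteration: each page repeatedly shares its rank over its outlinks."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += damping * share
        rank = new
    return rank

ranks = pagerank(links)
print(max(ranks, key=ranks.get))  # C -- three inbound links earn it the top spot
```

Page D, which nobody links to, ends up with the minimum possible score: the algorithmic version of Jenny’s coffee shop with no referrals.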

The role content plays in SEO

There would be no point to links if they didn’t direct searchers to something. That something is content! Content is more than just words; it’s anything meant to be consumed by searchers — there’s video content, image content, and of course, text. If search engines are answer machines, content is the means by which the engines deliver those answers.

Any time someone performs a search, there are thousands of possible results, so how do search engines decide which pages the searcher is going to find valuable? A big part of determining where your page will rank for a given query is how well the content on your page matches the query’s intent. In other words, does this page match the words that were searched and help fulfill the task the searcher was trying to accomplish?

Because of this focus on user satisfaction and task accomplishment, there’s no strict benchmarks on how long your content should be, how many times it should contain a keyword, or what you put in your header tags. All those can play a role in how well a page performs in search, but the focus should be on the users who will be reading the content.

Today, with hundreds or even thousands of ranking signals, the top three have stayed fairly consistent: links to your website (which serve as third-party credibility signals), on-page content (quality content that fulfills a searcher’s intent), and RankBrain.

What is RankBrain?

RankBrain is the machine learning component of Google’s core algorithm. Machine learning is a computer program that continues to improve its predictions over time through new observations and training data. In other words, it’s always learning, and because it’s always learning, search results should be constantly improving.

For example, if RankBrain notices a lower ranking URL providing a better result to users than the higher ranking URLs, you can bet that RankBrain will adjust those results, moving the more relevant result higher and demoting the lesser relevant pages as a byproduct.

Like most things with the search engine, we don’t know exactly what comprises RankBrain, but apparently, neither do the folks at Google.

What does this mean for SEOs?

Because Google will continue leveraging RankBrain to promote the most relevant, helpful content, we need to focus on fulfilling searcher intent more than ever before. Provide the best possible information and experience for searchers who might land on your page, and you’ve taken a big first step to performing well in a RankBrain world.

Engagement metrics: correlation, causation, or both?

With Google rankings, engagement metrics are most likely part correlation and part causation.

When we say engagement metrics, we mean data that represents how searchers interact with your site from search results. This includes things like:

Clicks (visits from search)
Time on page (amount of time the visitor spent on a page before leaving it)
Bounce rate (the percentage of all website sessions where users viewed only one page)
Pogo-sticking (clicking on an organic result and then quickly returning to the SERP to choose another result)
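As a rough illustration, the first few of these metrics can be computed from raw session data along the following lines. The numbers are hypothetical and the definitions simplified (real analytics tools measure time on page per pageview, not per session):

```python
# Hypothetical session records from an analytics export (made-up values).
sessions = [
    {"pages_viewed": 1, "time_on_page": 5},
    {"pages_viewed": 3, "time_on_page": 120},
    {"pages_viewed": 1, "time_on_page": 45},
    {"pages_viewed": 2, "time_on_page": 90},
]

# Bounce rate: share of sessions where the visitor viewed only one page.
bounces = sum(1 for s in sessions if s["pages_viewed"] == 1)
bounce_rate = bounces / len(sessions)

# Average time on page across sessions.
avg_time = sum(s["time_on_page"] for s in sessions) / len(sessions)
```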

Many tests, including Moz’s own ranking factor survey, have indicated that engagement metrics correlate with higher ranking, but causation has been hotly debated. Are good engagement metrics just indicative of highly ranked sites? Or are sites ranked highly because they possess good engagement metrics?

What Google has said

While they’ve never used the term “direct ranking signal,” Google has been clear that they absolutely use click data to modify the SERP for particular queries.

According to Google’s former Chief of Search Quality, Udi Manber:

“The ranking itself is affected by the click data. If we discover that, for a particular query, 80% of people click on #2 and only 10% click on #1, after a while we figure out probably #2 is the one people want, so we’ll switch it.”

Another comment from former Google engineer Edmond Lau corroborates this:

“It’s pretty clear that any reasonable search engine would use click data on their own results to feed back into ranking to improve the quality of search results. The actual mechanics of how click data is used is often proprietary, but Google makes it obvious that it uses click data with its patents on systems like rank-adjusted content items.”

Because Google needs to maintain and improve search quality, it seems inevitable that engagement metrics are more than correlation. Google stops short of calling engagement metrics a “ranking signal,” however, because those metrics are used to improve search quality, and the rank of individual URLs is just a byproduct of that.
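The behavior described in the quotes above can be caricatured as a toy reordering step. The logic and numbers below are invented purely for illustration and bear no relation to Google’s actual (proprietary) system:

```python
# Toy illustration of click-data-driven reordering: if a lower-ranked
# result draws far more clicks, move it up. Invented logic, not Google's.

def reorder_by_clicks(results):
    """results: list of (url, click_share) tuples in current rank order."""
    return sorted(results, key=lambda r: r[1], reverse=True)

# Mirrors the quote: most searchers click the #2 result, so it moves to #1.
serp = [("page-a.example", 0.10), ("page-b.example", 0.80), ("page-c.example", 0.05)]
adjusted = reorder_by_clicks(serp)
```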

What tests have confirmed

Various tests have confirmed that Google will adjust SERP order in response to searcher engagement:

Rand Fishkin’s 2014 test resulted in a #7 result moving up to the #1 spot after getting around 200 people to click on the URL from the SERP. Interestingly, ranking improvement seemed to be isolated to the location of the people who visited the link. The rank position spiked in the US, where many participants were located, whereas it remained lower on the page in Google Canada, Google Australia, etc.
Larry Kim’s comparison of top pages and their average dwell time pre- and post-RankBrain seemed to indicate that the machine-learning component of Google’s algorithm demotes the rank position of pages that people don’t spend as much time on.
Darren Shaw’s testing has shown user behavior’s impact on local search and map pack results as well.

Since user engagement metrics are clearly used to adjust the SERPs for quality, and rank position changes as a byproduct, it’s safe to say that SEOs should optimize for engagement. Engagement doesn’t change the objective quality of your web page, but rather your value to searchers relative to other results for that query. That’s why, after no changes to your page or its backlinks, it could decline in rankings if searchers’ behavior indicates they like other pages better.

In terms of ranking web pages, engagement metrics act like a fact-checker. Objective factors such as links and content first rank the page, then engagement metrics help Google adjust if they didn’t get it right.

The evolution of search results

Back when search engines lacked a lot of the sophistication they have today, the term “10 blue links” was coined to describe the flat structure of the SERP. Any time a search was performed, Google would return a page with 10 organic results, each in the same format.

In this search landscape, holding the #1 spot was the holy grail of SEO. But then something happened. Google began adding results in new formats on their search result pages, called SERP features. Some of these SERP features include:

Paid advertisements
Featured snippets
People Also Ask boxes
Local (map) pack
Knowledge panel

And Google is adding new ones all the time. It even experimented with “zero-result SERPs,” a phenomenon where only one result from the Knowledge Graph was displayed on the SERP with no results below it except for an option to “view more results.”

The addition of these features caused some initial panic for two main reasons. For one, many of these features caused organic results to be pushed down further on the SERP. Another byproduct is that fewer searchers are clicking on the organic results since more queries are being answered on the SERP itself.

So why would Google do this? It all goes back to the search experience. User behavior indicates that some queries are better satisfied by different content formats. Notice how the different types of SERP features match the different types of query intents.

Query intent → Possible SERP feature triggered

Informational → Featured Snippet
Informational with one answer → Knowledge Graph / Instant Answer
Local → Map Pack

We’ll talk more about intent in Chapter 3, but for now, it’s important to know that answers can be delivered to searchers in a wide array of formats, and how you structure your content can impact the format in which it appears in search.

Localized search

A search engine like Google has its own proprietary index of local business listings, from which it creates local search results.

If you are performing local SEO work for a business that has a physical location customers can visit (ex: dentist) or for a business that travels to visit their customers (ex: plumber), make sure that you claim, verify, and optimize a free Google My Business Listing.

When it comes to localized search results, Google uses three main factors to determine ranking:


Relevance is how well a local business matches what the searcher is looking for. To ensure that the business is doing everything it can to be relevant to searchers, make sure the business’ information is thoroughly and accurately filled out.


Google uses your geo-location to serve you better local results. Local search results are extremely sensitive to proximity, which refers to the location of the searcher and/or the location specified in the query (if the searcher included one).

Organic search results are sensitive to a searcher’s location, though seldom as pronounced as in local pack results.
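Proximity itself is straightforward to quantify: at its simplest, it is the great-circle distance between the searcher and the business. A sketch using the standard haversine formula, with hypothetical coordinates:

```python
# Great-circle distance via the haversine formula: one simple way a
# proximity signal could be quantified. Coordinates below are examples.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance between two (lat, lon) points in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# A searcher in downtown Seattle and a business roughly 1 km away.
dist = haversine_km(47.6062, -122.3321, 47.6150, -122.3300)
```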


With prominence as a factor, Google is looking to reward businesses that are well-known in the real world. In addition to a business’ offline prominence, Google also looks to some online factors to determine local ranking, such as:


The number of Google reviews a local business receives, and the sentiment of those reviews, have a notable impact on their ability to rank in local results.


A “business citation” or “business listing” is a web-based reference to a local business’ “NAP” (name, address, phone number) on a localized platform (Yelp, Acxiom, YP, Infogroup, Localeze, etc.).

Local rankings are influenced by the number and consistency of local business citations. Google pulls data from a wide variety of sources in continuously making up its local business index. When Google finds multiple consistent references to a business’s name, location, and phone number it strengthens Google’s “trust” in the validity of that data. This then leads to Google being able to show the business with a higher degree of confidence. Google also uses information from other sources on the web, such as links and articles.
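The consistency check described above can be imitated in miniature: normalize each NAP record so that superficial formatting differences don’t count as mismatches, then compare. A hypothetical sketch:

```python
# Minimal NAP-consistency check across directory listings (made-up data).
import re

def normalize_nap(record):
    """Lowercase, drop apostrophes and punctuation, collapse whitespace,
    so formatting differences between directories don't count as conflicts."""
    normalized = {}
    for key, value in record.items():
        value = value.lower().replace("'", "")
        normalized[key] = re.sub(r"[^a-z0-9]+", " ", value).strip()
    return normalized

# The same hypothetical business as listed on two different directories.
listings = [
    {"name": "Joe's Pizza", "address": "123 Main St.", "phone": "(555) 010-2000"},
    {"name": "Joes Pizza",  "address": "123 Main St",  "phone": "555-010-2000"},
]

normalized = [normalize_nap(r) for r in listings]
consistent = all(n == normalized[0] for n in normalized)
```

After normalization the two listings agree, which is exactly the kind of cross-source agreement the paragraph above says strengthens Google’s trust in the data.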

Check a local business’ citation accuracy here.

Organic ranking

SEO best practices also apply to local SEO, since Google also considers a website’s position in organic search results when determining local ranking.

In the next chapter, you’ll learn on-page best practices that will help Google and users better understand your content.

[Bonus!] Local engagement

Although not listed by Google as a local ranking determiner, the role of engagement is only going to increase as time goes on. Google continues to enrich local results by incorporating real-world data like popular times to visit and average length of visits…

…and even provides searchers with the ability to ask the business questions!

Now more than ever before, local results are being influenced by real-world data: how searchers actually interact with and respond to local businesses, rather than purely static (and game-able) information like links and citations.

Since Google wants to deliver the best, most relevant local businesses to searchers, it makes perfect sense for them to use real time engagement metrics to determine quality and relevance.

You don’t have to know the ins and outs of Google’s algorithm (that remains a mystery!), but by now you should have a great baseline knowledge of how the search engine finds, interprets, stores, and ranks content. Armed with that knowledge, let’s learn about choosing the keywords your content will target!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Laying the foundations of good SEO: the most important tasks (part 2)

Posted by on Jul 25, 2018 in SEO Articles | Comments Off on Laying the foundations of good SEO: the most important tasks (part 2)

SEO is not easy to master. It keeps evolving, with new specifics and techniques added almost daily.

However, it is imperative to lay the foundations of good SEO by accomplishing time-tested tasks that both beginner and advanced SEOs usually follow in their daily routines.

In part one of this series, the author explained the most important SEO tasks related to SEO tools, keyword research, and on-site optimization.

It is now time to tackle technical SEO, content, and off-page optimization.

Technical SEO

Technical SEO is vital to your site’s success. For example, one single error in your robots.txt file can prevent your site from being indexed.

Though technical SEO covers a broad range of subjects concerning the elements needed for optimization, you should primarily prioritize the following areas:

Crawl errors. If you keep receiving crawl error reports, Google cannot reach some of your site’s URLs and, consequently, cannot rank those pages. Regularly check crawl error notifications in the Google Search Console and be sure to fix them as soon as possible

Google’s access to pages. Unfortunately, crawl error reports do not necessarily mean that every unreported page has been crawled and indexed successfully. Sometimes Google is unable to access a page at all. Regularly check in the Google Search Console that all of your pages are visible to Google

Broken links. Broken links are a big red flag to Google. Regular checks for broken links should become a vital part of your SEO routine. Fortunately, there is no shortage of tools to find and fix broken links (e.g. Netpeak Spider, Serpstat, Screaming Frog SEO Spider)

HTTPS has been a ranking signal since 2014, so if you want to give your site an SEO boost, implement HTTPS. Additionally, it will provide your site with a layer of security, which your visitors will appreciate. Check out this guide to make sure you correctly migrate from HTTP to HTTPS

Duplicate metatags. Google doesn’t appreciate duplicates of anything — be it content, URLs, or metatags. So access the Google Search Console and go to the HTML Improvements tab to find duplicates of title tags and meta descriptions, and then fix them

Mobile-friendliness. Google has started to use the mobile version of a page for indexing and ranking, so making your site’s pages mobile-friendly is highly advised. If your site is not optimized for mobile, it can still rank nicely in search, but its chances of topping the SERPs will not be as high. So test your pages and make them mobile-friendly

Loading speed. Page speed for both desktop and mobile is a key ranking factor. Make sure to check your website’s loading speed with the PageSpeed Insights tool, and then move forward by implementing any optimization suggestions it provides. Otherwise, you may risk appearing lower in the SERPs.
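Returning to the broken-links item above: a minimal checker needs nothing beyond the standard library. This is an illustrative sketch, the URLs are placeholders, and a real crawl should also respect robots.txt and rate limits:

```python
# A minimal broken-link checker using only the Python standard library,
# as a lightweight alternative to the crawlers mentioned above.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def check_link(url, timeout=10):
    """Return (url, status): an HTTP status code or an error string."""
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "link-check"})
        with urlopen(req, timeout=timeout) as resp:
            return url, resp.status
    except HTTPError as e:
        return url, e.code           # e.g. 404: page is gone
    except URLError as e:
        return url, str(e.reason)    # DNS failure, timeout, etc.

def is_broken(status):
    """Treat anything other than a 2xx/3xx status code as broken."""
    return not (isinstance(status, int) and 200 <= status < 400)

# Example usage (requires network access; URLs are placeholders):
# results = [check_link(u) for u in ["https://example.com/", "https://example.com/old-page"]]
# broken = [(u, s) for u, s in results if is_broken(s)]
```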


Content

Content is the fuel that feeds Google. Content is an important ranking factor, and sites that consistently craft high-quality content are more likely to have better rankings (but it is not a single decisive ranking factor).

To succeed with content SEO-wise, you should prioritize the following areas:

Duplicate content. Duplicate content is a big no-no. Check your site with Copyscape or Siteliner to find all pages that are similar or have content that is partially featured on a third-party website (i.e. plagiarized content). Otherwise, you risk ranking losses or even a Google penalty

Keyword use. Optimizing your content around core keywords (or a set of keywords) might be hard, but it is absolutely necessary to get your content seen by your target audiences. Figure out which core and support keywords to place on specific pages of your website, and track their performance regularly

Content structure. It is hardly a secret that customers skim, rather than read, content. For this reason it’s important to properly structure your content:

Use shorter sentences
Break down longer paragraphs
Use subtitles and bulleted lists
Include multimedia elements, such as images, videos, GIFs and audio files.
This will make your content easier to digest and should keep visitors on a page for longer.

Audience personas. Never start putting together a piece of content without having a clear picture of your audience persona in mind, including gender, age, occupation, responsibilities, challenges, and problems they need to solve. Additionally, be mindful of the stage in the buyers’ journey your audience may be at, and enhance your content accordingly

In-depth, high-quality content. The importance of high-quality content cannot be overemphasized. If you cannot produce quality content, you will lose a considerable portion of your ranking potential in Google and any other search engine. Put out expert content that is supported by data and your own research, such as surveys, reviews, links or traffic analysis. Only high-quality content will matter to your readers, journalists, and eventually to search engines

Schema markup. Want to help Google understand your content better? Use Schema markup. Though it is not a ranking factor per se, adding rich snippets may increase CTR and, accordingly, benefit your appearance in the SERPs. To do the heavy lifting, Google offers its Structured Data Testing tool

Multimedia elements. The more images, videos, GIFs, Twitter embeds, and other visuals you use in your content, the better. They allow you to illustrate your point, keep users on the page, and help your site to rank better.
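To make the Schema markup suggestion above concrete, a JSON-LD snippet for a hypothetical local business might look like the following. It would go inside a script tag with type "application/ld+json" on the page; every value here is made up for illustration:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Joe's Pizza",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St",
    "addressLocality": "Seattle",
    "addressRegion": "WA"
  },
  "telephone": "+1-555-010-2000",
  "openingHours": "Mo-Su 11:00-22:00"
}
```

Markup like this can make a page eligible for rich results, which is where the CTR benefit mentioned above comes from; it can be validated with Google’s Structured Data Testing tool.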

Off-site optimization

Links are the bread and butter of SEO. No matter how important other ranking signals may become, links will remain crucial to calculating a website’s ranking in the SERPs, since they are viewed as external citation authority.

Off-site optimization and link building are not easy to master. To increase ranking, you need to attract high-quality backlinks from relevant, trustworthy resources – and links like these are not easy to come by.

Here are a few practical steps you can take to start driving backlinks:

Analysis of existing links. Before building new backlinks, look through existing ones. It will help you:

Understand which sites link back to you
See which pages attract backlinks and which do not
Disavow backlinks that negatively impact your appearance in the SERPs because of their lack of relevance and trustworthiness
Delete broken backlinks.
To run the analysis, you can use Ahrefs, MajesticSEO, Netpeak Spider, or any other backlink analysis tool you may have access to

Analysis of competitor links. One of the most efficient approaches to driving backlinks from your niche is to analyze your competitors’ backlinks and emulate their strategy. Your goal is to find which websites and pages are most linked to, and which of the linking sites drive the most traffic. After that, you need to drive backlinks from the top-performing resources

Guest posting. Featuring your content in established media and on respected sites is one of the best long-term strategies for driving high-quality links and traffic to your website. Get invested with guest posting, and you may never run out of backlinks

The problem with guest posting is that you need to become a contributor first, and only then will you get a coveted author box with a link to your website. To get on the radar of established media or industry thought leaders, you have to master outreach – featured posts, links, and mentions will follow (provided your guest posts are good enough)

Directories and listings. Registering your business on directories and listings is the easiest way to improve your site’s positions in local search. All you need to do is identify top-tier business directories in your niche, fill out and optimize your directory accounts, and make sure that all information you submit is consistent across all directories, listings, and CDAs

Google My Business. A claimed and properly optimized account at Google My Business (and Bing Places for Business) can make all the difference for your company. It will help you secure a spot in Google’s local three-pack, which means more local traffic and improved rankings

Link-worthy content. Finally, it is worth noting that content can make or break your link building efforts, specifically when it comes to outreach and guest posting. You will not be able to create contributor accounts and garner backlinks if your content is repetitive and does not offer actionable advice to users. You need to stand out to succeed in the content department.


In this article the author has shared perspectives on the most important SEO tasks with regard to technical SEO, content, and off-page optimization.

These three areas of SEO knowledge are essential to master if you want to succeed in improving your site’s SERP positions, driving traffic, and attracting valuable leads. However, bear in mind that you need to set up and fine-tune your SEO tools, do your keyword research, and improve your on-page SEO first.
