SEO for JavaScript-powered websites (Google IO 18 summary)

Posted on Jul 16, 2018 in SEO Articles

You have probably heard that at the recent Google I/O ’18, Google shed some light on SEO.

Tom Greenaway and John Mueller of Google presented a session about making your modern JavaScript-powered websites search friendly.

They listed some recommended best practices, useful tools, and a Google policy change.

Here’s the thing:

In a pretty un-Google-like way, the duo also shed some light on how the actual crawl and index process for JavaScript websites works.

Check out the video here:

But if you don’t want to spend 40 minutes watching the recording, hang around, because here’s a quick summary of the session’s key points.

A brief background introduction on the presenters…

Tom Greenaway is a senior developer advocate from Australia, while John Mueller (aka johnmu, ring a bell?) is Google’s webmaster trends analyst, based in Zurich, Switzerland.

How do crawling, rendering and indexing work for JavaScript-powered websites?

Tom started the talk by sharing a little background on search engines.

Here’s the deal,

The purpose of a search engine is to provide a relevant list of results to answer users’ queries. Those answers are pulled from a compiled library of web pages.

That library is the index.

Building an index starts with a crawlable URL.

Now, the crawler is designed to find content to crawl.

But in order to do this, the content must be retrievable via a URL. When the crawler gets to a URL, it looks through the HTML to index the page and to find new links to crawl.

Here’s a diagram on how search works for Google.

So how do you make sure that your content is reachable by Googlebot?

Here’s what you need to know: Tom shared six steps to ensure your web page gets indexed.

1. Make sure that your URL is crawlable
– Set up robots.txt at the top-level domain of your site. Robots.txt lets Googlebot know which URLs to crawl and which to ignore.

2. Utilize canonical tags
– In cases of content syndication, where content is distributed across different sites to maximize exposure, the source document should be tagged as the canonical document.

3. Make sure the URL is clean and unique
– Don’t embed session information in the URL.

4. Provide a sitemap to Googlebot
– That way the crawler has a list of URLs to crawl and you can sleep better at night knowing your website is properly crawled.

5. Use the History API
– It replaces the hashbang (#!) URL format, which is no longer indexed.

6. Make sure your links are anchor tags with HREF attributes
– Googlebot only recognizes links with BOTH an anchor tag and an HREF attribute; otherwise they won’t be crawled, and therefore never indexed.
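For example, here is the kind of markup Googlebot can and cannot follow, plus the History API in place of hashbangs (an illustrative sketch, not taken from the talk; the URLs and function name are placeholders):

<!-- Crawlable: an anchor tag with an HREF attribute -->
<a href="/products/blue-widget">Blue widget</a>

<!-- Not crawlable: no anchor tag, no HREF attribute -->
<span onclick="loadPage('/products/blue-widget')">Blue widget</span>

<script>
  // Instead of location.hash = '#!/products/blue-widget', a client-side
  // router can update the address bar with the History API:
  history.pushState({}, '', '/products/blue-widget');
</script>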

More importantly,

Tom said Google has been encountering a list of problems when trying to crawl and index websites built with JavaScript.

Here’s the list of the most commonly faced problems in JavaScript website indexing.

Have a good look at it; you don’t want to repeat the same mistakes.

1. HTML delivered from the server is devoid of any content
– This leads Googlebot to assume that there’s nothing to index.

2. Lazy-loaded images are only sometimes indexable
– To make sure they are indexed, use a noscript tag or structured data (see the markup sketch after this list).
– Take caution: images referenced only through CSS are not indexed.

3. Any content that is triggered via an interaction won’t be indexed
– Googlebot is not an interactive bot, which means it won’t go around clicking tabs on your website. Make sure the bot can get to all your content, either by preloading it or by using CSS to toggle its visibility on and off.
– Better yet, just use separate URLs to navigate users and Googlebot to those pages individually.

4. Rendering timeouts
– Make sure your page is efficient and performant: limit the number of embedded resources and avoid artificial delays such as timed interstitials.

5. APIs that store local information are not supported
– Googlebot crawls and renders your pages in a stateless way.
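Here is a minimal sketch of the noscript approach for a lazy-loaded image (illustrative markup only; the class name, data attribute and file paths are placeholders):

<!-- The real image is swapped in by a lazy-loading script... -->
<img class="lazy" data-src="/images/product-photo.jpg" alt="Product photo">
<!-- ...while the noscript fallback gives crawlers a plain, indexable img tag -->
<noscript>
  <img src="/images/product-photo.jpg" alt="Product photo">
</noscript>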

Now, due to the increasingly widespread use of JavaScript, another step has been added between crawling and indexing: rendering.

Rendering is the construction of the HTML itself.

As mentioned before, the crawler needs to sift through your HTML in order to index your page. JavaScript-powered websites therefore need to be rendered before they can be indexed.

According to Tom and John, Googlebot is already rendering your JavaScript websites.

Here’s what we can make of the rendering and indexing process for a JavaScript website.

1. Googlebot uses the Chrome 41 browser for rendering
– Chrome 41 is from 2015, and any API added after Chrome 41 is not supported.

2. Rendering of JavaScript websites in Search is deferred
– Rendering web pages is a resource-heavy process, so rendering may be delayed for a few days until Google has free resources.

3. Two-phase indexing
– The first indexing happens before the rendering process is complete. When the final render arrives, there is a second indexing.
– The second indexing doesn’t check for the canonical tag, so the initially rendered version needs to include the canonical link, or else Googlebot will miss it altogether.
– Because of this two-phase indexing, the indexability, metadata, canonical tags and HTTP codes of your web pages can be affected.

John Mueller then took the baton and shared some basic information on rendering.

What’s important is that he told the crowd which rendering method Google prefers.

The options: client-side, server-side, hybrid and dynamic rendering.

1. Client-side rendering
– The traditional setup, where rendering happens in the user’s browser or on a search engine.

2. Server-side rendering
– Your server handles the rendering and serves static HTML to users and search engines alike.

3. Hybrid rendering (the long-term recommendation)
– Pre-rendered HTML is sent to users and search engines, and JavaScript is then layered on top of it in the browser. Search engines simply pick up the pre-rendered HTML content.

4. Dynamic rendering (the policy change and Google’s preferred way)
– This method sends client-side rendered content to users, while search engines get server-side rendered content.
– It works by having your site dynamically detect whether a request comes from a search engine crawler.
– Device-specific content needs to be served accordingly (the desktop version to the desktop crawler and the mobile version to the mobile crawler).

How hybrid rendering works.

Now that it’s out in the open that Google prefers the (new) dynamic rendering method to help the crawling, rendering and indexing of your site, John also gave a few suggestions on how to implement dynamic rendering.

Ways to implement dynamic rendering

1. Puppeteer
– A Node.js library that drives a headless version of Google Chrome, allowing you to render pages on your own server (see the sketch below).

2. Rendertron
– Can be run as software or as a service; it renders and caches your content on your side.

Both of these are open-source projects with abundant room for customization.
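To make the idea concrete, here is a rough sketch of a Puppeteer-based renderer (our illustration, not code from the talk; the function name is a placeholder):

const puppeteer = require('puppeteer');

// Render a URL in headless Chrome and return the final HTML,
// ready to be cached and served to crawlers
async function renderForBots(url) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' }); // wait for scripts to settle
  const html = await page.content();
  await browser.close();
  return html;
}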

John also advised that rendering is resource-intensive, so do it out of band from your normal web server, and implement caching where needed.

The most important key point of dynamic rendering is this:

it has to be able to tell a search engine request apart from a normal user request.

But how could you recognize a Googlebot request?

The first way is to find Googlebot in the user-agent string.
The second way is to do a reverse DNS lookup.
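Here is a minimal sketch of the user-agent approach as Node.js/Express middleware (our illustration, not Google’s code; a production setup should confirm the match with a reverse DNS lookup, since the user-agent string can be spoofed):

const express = require('express');
const app = express();

// Flag requests whose user-agent claims to be Googlebot,
// so later handlers can serve the server-side rendered HTML
app.use((req, res, next) => {
  const userAgent = req.headers['user-agent'] || '';
  req.isSearchBot = /googlebot/i.test(userAgent);
  next();
});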

John stressed during the session that implementing the suggested rendering methods is not a requirement for indexing.

What it does is make the crawling and indexing process easier for Googlebot.

Considering the resources needed to run server-side rendering, weigh the toll before implementing it.

So when do you need to have dynamic rendering?

Here’s when:

When you have a large and constantly updated website, like a news portal, because you want to be indexed quickly and correctly.

Or when you rely on a lot of modern JavaScript functionality that is not supported by Chrome 41, which means Googlebot won’t be able to render it correctly.

And finally, if your site relies on social media or chat applications that require access to your page’s content.

Now let’s look at when you don’t need to use dynamic rendering.

The answer is simple,

if Googlebot can index your pages correctly, you don’t need to implement anything.

So how can you know whether Googlebot is doing its job correctly?

You can check progressively.

Keep in mind that you don’t need to run tests on every single web page. Test perhaps two pages per template, just to make sure they work fine.

So here’s how to check whether your pages are indexed:

1. Use Fetch as Google in Google Search Console after verifying ownership. This will show you the HTTP response as received by Googlebot, before any rendering.

2. Run a Google Mobile Friendly Test.

Why?

Because of the mobile-first indexing being rolled out by Google, mobile pages will be the primary focus of indexing. If the pages render well in the test, it means Googlebot can render your page for Search.

3. Keep an eye out for the new function in the Mobile-Friendly Test. It shows you the Googlebot-rendered version and full information on loading issues in case the page doesn’t render properly.

4. You can always check the developer console when your page fails in a browser. The developer console gives you access to the console log of what happens when the page is rendered, which allows you to check for a bunch of issues.

5. All of these diagnostics can also be run in the Rich Results Test for desktop sites.

At the end of the session, John also mentioned some changes that are coming.

The first piece of happy news:

Google will be moving rendering closer to crawling and indexing.

We can safely assume this means the second indexing will happen much sooner than before.

The second piece of happy news:

Google will make Googlebot use a more modern version of Chrome, which means wider API support.

They did make it clear that these changes will not happen until at least the end of the year.

To make things easier, the session closed with a four-step checklist for making sure your JavaScript-powered website is search friendly.

With that, the session concluded. Do check out the slideshow below for a quick refresher.

All in all, Google is taking the mic and telling you exactly what they want.

Better take some notes.

Delivering Search Friendly JavaScript-Powered Websites (Google I/O 18 summary), a slideshow from Jia Thong Lo


The Hierarchy of Evidence for Digital Marketing Testing

Posted on Jul 15, 2018 in SEO Articles

The Hierarchy of Evidence for Digital Marketing Testing

In the two-and-a-bit years that I’ve been working in digital marketing, I’ve been keen to understand the reasons why we make the decisions we do. There’s a wide variety of ways to approach a problem, and although many of them have value, there has to be a best way to make sure that the decision you make is the right one. I wanted to take my evidence-first way of thinking and apply it to this field.

In a previous life, I worked in clinical science, specifically in the field of medical physics. As part of this I was involved in planning and carrying out clinical trials, and came across the concept of a ‘hierarchy of evidence’. In clinical research, this refers to the different standards of evidence that can be used to support a claim – be that a new drug, piece of technology, surgical technique or any other intervention that is claimed to have a beneficial effect – and how they are ranked in terms of strength. There are many different formulations of this hierarchy, but a simple version can be seen here:

According to this ordering, a systematic review is the best type of evidence. This involves taking a look across all of the evidence provided in clinical trials, and negates the effects of cherry-picking – the practice of using only data that supports your claim, and ignoring negative or neutral results. With a systematic review we can be sure that all of the evidence available is being represented. A randomised controlled trial is a method of removing any extraneous factors from your test, and of making sure that the effect you’re measuring is only due to the intervention you’re making.

This is opposed to case-control reports, which involve looking at historical data of two populations (e.g. people who took one drug vs. another) and seeing what their outcomes were. This has its uses when it is not possible to carry out a proper trial, but it is vulnerable to correlations being misidentified as causation. For example, patients who were prescribed a certain course of treatment may happen to live in more affluent areas and therefore have hundreds of other factors causing them to have better outcomes (better education, nutrition, fewer other health problems etc.).

All of these types of tests should be viewed as more authoritative than the opinion of anyone, regardless of how experienced or qualified they are. Often bad practices and ideas are carried on without being re-examined for a long time, and the only way we can be sure that something works is to test it. I believe that this is also true in my new field.

A hierarchy of evidence for digital marketing

While working at Distilled, I’ve been thinking about how I can apply my evidence-focussed mindset to my new role in digital marketing. I came up with the idea for a hierarchy of evidence for digital marketing that could be applied across all areas. My version looks like this:

A few caveats before I start: this pyramid is by no means comprehensive – there are countless shades of grey between each level, and sometimes something that I’ve put near the bottom will be a better solution for your problem than something at the top.

I’ll start at the bottom and work my way up from worst to best standards of evidence.

Hunches

Obviously, the weakest form of evidence you can use to base any decision on is no evidence at all. That’s what a hunch is – a feeling that may or may not be based on past experience, or just what ‘feels right’. But in my opinion as a cold-hearted scientist, evidence nearly always trumps feelings. Especially when it comes to making good decisions.

Having said that, anyone can fall into the trap of trusting hunches even when better evidence is available.

Best practice

It’s easy to find advice on the ‘best practice’ for any given intervention in digital marketing. A lot of it is brilliant (for example DistilledU), but that does not mean that it is enough. No matter how good best practice advice is, it will never compare to evidence tailored to your specific situation and application. Best practice is applicable to everything, but perfect for nothing.

Best practice is nevertheless a good option when you don’t have the time or resources to perform thorough tests yourself, and it plays a very important role when deciding what direction to push tests in.

Anecdotal evidence

A common mistake in all walks of life is thinking that just because something worked once before, it will work all of the time. This is generally not true – the most important thing is always data, not anecdotes. It’s especially important not to assume that a method that worked once will work again in this field, as we know things are always changing, and every case is wildly different.

As with the best practice advice above, anecdotal evidence can be useful when it informs the experimentation you do in the future, but it should not be relied on by itself.

Uncontrolled/badly controlled tests

You’ve decided what intervention you want to make, you’ve enacted it and you’ve measured the results. This sounds like exactly the sort of thing you should be doing, doesn’t it? But you’ve forgotten one key thing – controls! You need something to compare against, to make sure that the changes you’re seeing after your intervention are not due to random chance, or some other change outside of your control that you haven’t accounted for. This is where you need to remember that correlation is not causation!

Almost as bad as not controlling at all is designing your experiment badly, such that your control is meaningless. For example, a sporting goods ecommerce site may make a change to half the pages on its site, and measure the effect on transactions. If the change is made on the ‘cricket’ category just before the cricket season starts, and is compared against the ‘football’ category, you might see a boost in sales for ‘cricket’ which is irrelevant to the changes you made. This is why, when possible, the pages that are changed should be selected randomly, to minimise the effect of biases.
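One simple way to randomize the assignment (a sketch of the general idea, not Distilled’s actual tooling) is to bucket pages deterministically by hashing their URLs, so the split is unbiased with respect to category, season or anything else:

const crypto = require('crypto');

// Deterministically assign a page to 'variant' or 'control' from its URL,
// giving a roughly 50/50 split that is stable across runs
function bucketForPage(url) {
  const hash = crypto.createHash('md5').update(url).digest();
  return hash[0] % 2 === 0 ? 'variant' : 'control';
}

console.log(bucketForPage('/cricket/bats')); // e.g. 'variant'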

Randomised controlled trials (A/B testing)

The gold standard for almost any field where it’s possible is a randomised controlled trial (RCT). This is true in medicine, and it’s definitely true in digital marketing as well, where they’re generally referred to as A/B tests. This does not mean that RCTs are without flaws, and it is important to set up your trial right to negate any biases that might creep in. It is also vital to understand the statistics involved here. My colleague Tom has written on this recently, and I highly recommend reading his blog post if you’re interested in the technical details.

A/B testing has been used extensively in CRO, paid media and email marketing for a long time, but it has the potential to be extremely valuable in almost any area you can think of. In the last couple of years, we’ve been putting this into practice with SEO, via our DistilledODN tool. It’s incredibly rewarding to walk the walk as well as talk the talk with respect to split testing, and to be able to prove for certain that what we’re recommending is the right thing to do for a client.

Sign up to find out more about our new ODN platform, for a scientific approach to SEO.


The reality of testing

Even with a split test that has been set up perfectly, it is still possible to make mistakes. A test can only show you results for things you’re testing for: if you don’t come up with a good intervention to test, you won’t see incredible results. Also, it’s important not to read too much into your results. Once you’ve found something that works brilliantly in your test, don’t assume it will work forever, as things are bound to change soon. The only solution is to test as often as possible.

If you want to know more about the standards of evidence in clinical research and why they’re important, I highly recommend the book Bad Science by Ben Goldacre. If you have anything to add, please do weigh in below the line!

5 Google SERP Features That Can Give You Tons of Traffic

Posted on Jul 15, 2018 in SEO Articles

5 Google SERP Features That Can Give You Tons of Traffic

With how dynamic the SERP (Search Engine Results Page) has become, there’s actually much more to it than being ranked first. Now there are more ways to appear on the Google SERP than being a blue link, especially on mobile.

So today I’m sharing with you a list of SERP features that you should target. Five, plus one extra.

What are SERP features?

Other than the standard blue links, Google is offering us users so much more.

Type square root of 48 into the search bar and you’ll get a calculator with the answer on it, which is pretty cool and convenient. When I asked Google how old Obama is, besides an instant answer, there was also a list of prominent figures related to Obama with their ages attached.

Google whips up a calculator to answer your math question.

What this means is that Google is getting better and better at giving users what they want. And they know a list of blue links on a white backdrop is not going to cut it any longer. That’s why they introduced, and constantly update, a bunch of SERP features.

Now, why does this concern you? Because you can actually take advantage of some of those unique SERP features and give yourself more exposure, and therefore more traffic.

Well, I didn’t ask for Trump, Clinton or Michelle Obama’s age. But it’s nice to know anyway. Thanks, Google!

Fighting for a page-one rank is proving to be much harder than before. That’s why we’re gonna fight smart.

Let’s get started with the 5 SERP features that can give you tons of traffic.
1. GOOGLE ADWORDS

This is a no-brainer. Wanna get listed on page one, even higher than the number one organic result? Well, Google’s got your back, provided you pay them some money (oops).

But if you think placing ads is as easy as paying and watching traffic trickle in, think again. Planning an AdWords campaign is actually an intense game of keyword research.

Just like when trying to rank a page, you need to pinpoint the right keywords. Or even better, niche keywords that will get you dedicated and relevant visitors.

There are two types of keywords you can target. Short-tail keywords may have more search volume, but they are also highly competitive and expensive. Long-tail keywords require more in-depth research and might not drive as much traffic, but what you get are dedicated visitors.

You might wonder: is it worth paying that much money for ads? Well, ads have been around for decades and they’re not going away. I would recommend ads to a fresh business that needs some exposure.

Content and inbound marketing are great. But when you can’t afford the “slow and steady wins the race” method, placing some strategic ads is gonna help you boost traffic and build an audience.

2. FEATURED SNIPPET

Hey isn’t that OUR article?

Ever heard of the featured snippet? Wondering what it is? Well, ever searched for something and got a little box of information on top that answers your question right away? That lifesaver is the featured snippet, aka another spot on the SERP that you can grab.

The featured snippet is curated by Google’s Knowledge Graph from its library of indexed content. To answer a specific query, Google extracts and reformats relevant information from the single page with the most fitting answer, then dumps it (well, perhaps more gracefully than dumps) into a box at the top of the SERP.

Also called position zero, the featured snippet is placed right at the top of the SERP. Yes, even before the number 1 ranked web page. According to HubSpot, the featured snippet gets almost twice the click-through rate of the other links listed on the SERP. That means a whole lot of traffic.

The click-through rate for the featured snippet is almost twice as high.

Now that we know how much traffic a featured snippet can drive, how can we get that spot?

If you have an article in the top 5 spots of the SERP, you are already halfway there. All that’s left is to ask yourself: does your article answer the question in a straightforward and concise way?

Well, Google is smart, but that doesn’t mean it doesn’t need a little push. Editing your content a little to help it fit in the box will go a long way, both for your visibility and your traffic. Now go get that featured snippet spot.

3. GOOGLE IMAGES

Infographics and graphs tend to do well on image search.

A picture is worth a thousand words. There is a reason the Images tab sits right beside the All tab on the SERP. Images are actually one of the regular SERP features that is largely ignored.

There are those who want to read a 5,000-word in-depth discussion of a single topic. There are also those who want their answer at a glance, laid out in a perfectly constructed chart, graph or table (you get the idea). And there are those who will only hire your services after they’ve had a look at your work.

Your image can also make it into the featured snippet box. Note that the snippet text itself may be extracted from a different website.

So if you can sketch up a mean infographic or graph, with the perfect ratio of graphics to information, you might have just gotten the key to unlocking a bunch of traffic. Moreover, if you’re an interior designer, hotelier, jeweler or even a pâtissier, where being able to show your work is important in securing clients, targeting Google Images rankings will definitely help.

Now, how can you optimize for Google image search? If you are into SEO, you are probably aware of how important image ALT text is. If you want to rank in Google Images, here is the list of optimizations you need to do (see the markup sketch after the list).

Relevant image file name. You don’t want it to be Q2RTZ4GP.jpg; let’s make it macaroon-pauls-patisserie.jpg instead.

Optimized ALT tag. The ALT attribute is used as the text alternative to your picture. Think of it as a description of the image for search engines.

Informative image caption. Captioning an image and drawing relevance to the topic not only helps visitors understand it better, it also means better optimization.

Insertion in a relevant and optimized page. Images from pages that rank high tend to rank higher in image search too.

Make sure your image dimensions are not out of the norm. Stick to the good old 16:9 or 4:3.

No heavy image files, please. Keep it under 1MB, because who’s going to look at an image that takes more than a second to load?

Share it on multiple platforms. Be it Flickr, ImageShack, Twitter or Reddit, just spread it like butter.
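Putting the first three points together, the markup might look like this (an illustrative sketch built around the hypothetical patisserie example above):

<figure>
  <!-- Descriptive file name and ALT text, wrapped with an informative caption -->
  <img src="/images/macaroon-pauls-patisserie.jpg"
       alt="Pastel macaroons on a cake stand at Paul's Patisserie">
  <figcaption>Our signature macaroons, baked fresh daily at Paul's Patisserie.</figcaption>
</figure>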

4. LOCAL SEARCH PACK

Now we’re treading a bit into local SEO waters.

One of the most frequently used query formats on Google Search is “… near me”. The ellipsis could be a restaurant, a plumbing service, a hardware store; basically, any local business in close proximity to the user.

Being listed in the local search pack is important for small local businesses because it means tons of exposure.

Google will basically try to list all local businesses, but you need to claim your listing in order to optimize it and use it to your advantage. In the rare case where your business is not on Google’s radar, simply create the listing. Here is a guide on how to get started with Google My Business (GMB).

Setting up your GMB account doesn’t mean the job is done, though. The most important factor in this SERP feature is, of course, the proximity of your listing to the user. However, there is also an extensive list of information and settings you can fill in to increase your ranking.

Moz compiled a list of the top 30 foundational factors for ranking in the local search pack. If you’re interested, you can read their extensive local search ranking factors.

5. PEOPLE ALSO ASK

Google tries to give the best answer to every query, and they try hard. People Also Ask is one of the SERP features where they try to give the user the most detailed and relevant answer, by suggesting similar questions that others have asked.

Every question bar you click shows you the answer while simultaneously loading more question bars at the bottom, related to the one you just clicked. You can go on and on and on, and basically get all your answers.

Google shows you the answer when you click on the respective question box.

Sometimes the answer presented is actually the featured snippet for that exact query. Other times it is not.

We are not sure how Google decides which page to feature and which not to. One thing that is definite, however, is that the page has gotta answer the question.

The featured snippet for the exact People Also Ask query in the actual SERP is not the same one displayed in the question box.

The answer featured in the People Also Ask box is actually extracted from the 10th link on the SERP.

Like all quality content, you need solid and concise writing that is relevant to your targeted keywords.

People Also Ask, like the featured snippet, is placed above the 10 organic blue links on a white background. That means a whole lot of exposure, and a whole lot of traffic that could be yours.

EXTRA: YOUTUBE VIDEOS

Entertainment queries tend to trigger more YouTube video results.

YouTube is a Google property, which means it makes perfect sense for its videos to be featured on the SERP. Featuring videos in the SERP is one of the steps Google has taken to give more dynamic results. Some things are just better explained in motion and speech.

Now, there are some queries out there that can trigger a SERP with half of the results being YouTube videos. There are also some queries that won’t show you a single video at all.

Informational videos, tutorials, how-tos, reviews and entertainment queries are the most likely to trigger a video result.

So how do you link your YouTube videos back to your main website and gain traffic? Make good use of the description box, or mention your domain in the video. Videos can be a good way to build up your brand image and authority.

So while videos might not have a direct impact on your website’s organic traffic, they are another way to gain a place on the Google SERP and spread your brand name.

5 Google SERP Features That Can Give You Tons of Traffic, a slideshow from Jia Thong Lo

Here’s a quick slideshow to help you review what you have just read. Now go get that traffic!


5 Reasons Your Page Is Not Indexed On Google Search and How to Fix It

Posted on Jul 15, 2018 in SEO Articles

After hours of coding, writing, designing and optimizing, finally, a new web page has gone live. But hey, why is it not appearing in Google Search? What have I done wrong? Why does Google hate me?

Now, now, we have all been there before. I have learned to give it at least a day or two before trying to seek out my new blog post on Google Search, because I have long accepted that that’s how long Google needs to actually put my newborn, I mean my new blog post, on Google Search.

The process of getting a new web page or a new website onto Google Search is a long and winding one. But it’s one worth learning about.

Let’s start with the basics.

What is Search? How does it work?

Here’s a video from Google where Matt Cutts tells you a little about how search works.

Have you watched it? If yes, please bear with my little attempt at summarizing.

For content to appear on Google Search, it has to go through spiders. No, not real spiders, but a program called a spider. The spider starts with a link, then crawls through the content. If it sees another link in the content, it crawls that too, and the process repeats.

Crawled content, i.e. web pages, is then stored in Google’s index. When a user makes a query, answers are pulled from the index.

So in order for your content to show up on Google Search, you first have to make sure your website is crawlable (the Google crawler is called Googlebot). Then you have to make sure it’s indexed correctly by the indexer, which is called Caffeine. Only then will you see your content appearing on Google Search.

Here’s the thing: how do you check with Google exactly whether or not they have indexed your content? Well, you can’t. Like all things in SEO, the next best strategy is to analyze and give it your best guess.

Try typing site:insertyourdomainhere.com into the Google search bar and hit Enter. Google Search will give you a list of all the indexed web pages from your domain.

But, as Matt Cutts once said, web pages that are not crawled CAN appear on Google Search as well. That’s another topic for another day.

Still interested? The video is only 4 minutes long. You can have a look if you want.

Anyway, let’s get back to the topic. Me, I will give it at least a couple of days, at most a week, before I start freaking out about why my content is still not appearing on Google Search.

If it has been more than a week, or even a month, and your website is still not there, here is a list of things you need to consider.

1. Have you checked your robots?

Sometimes a little overlooked detail can have a big effect.

Robots.txt is the first place Googlebot visits on a website, to learn which URLs it is allowed to crawl; robots meta tags then tell it which pages are noindex or nofollow and such.

Do you have this in your HTML head section?
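It looks like this (the standard robots meta tag, placed inside the head element):

<meta name="robots" content="noindex">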

The robots noindex tag is handy for making sure that a certain page will not be indexed, and therefore not listed on Google Search.

Commonly used while a page is still under construction, the tag should be removed when the web page is ready to go live.

However, because of its page-specific nature, it comes as no surprise that the tag may get removed on one page but not another. With the tag still applied, your page will not be indexed, and therefore won’t appear in the search results.

Similarly, an X-Robots-Tag HTTP header can be added to the HTTP response, which can then be used as a site-wide alternative to the robots meta tag.
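For example, on an Apache server it can be set like this (a sketch; it needs mod_headers enabled, other servers have their own syntax, and the PDF pattern is just an illustration):

# In .htaccess or the server config: send noindex for every PDF file
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>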

Again, with the tag applied, your page will not show up in Search. Make sure to fix it.

Read more about meta tags here: How To Control Web Crawlers With Robots.txt, Meta Robot Tags & SEOPressor

2. Are you pointing Googlebot to a redirect chain?

Googlebot is generally a patient bot. It will go through every link it comes across, do its best to read the HTML, then pass it to Caffeine for indexing.

However, if you set up a long, winding redirection chain, or the page is simply unreachable, Googlebot will stop looking. It will literally stop crawling, sabotaging any chance of your page being indexed.

Not being indexed means not being listed on Google Search.

I’m perfectly aware that 30x redirects are useful and crucial to implement. However, implemented incorrectly, they can ruin not only your SEO but also the user experience.

Another thing: don’t mix up 301 and 302. Is the page moved permanently or moved temporarily? A confused Googlebot is not an efficient Googlebot.
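In an Express app, for instance, the difference is a single status code, and picking the right one is the whole job (an illustrative sketch with placeholder routes):

const express = require('express');
const app = express();

// 301: the page has moved for good; search engines should update their index
app.get('/old-page', (req, res) => res.redirect(301, '/new-page'));

// 302: a temporary detour; search engines should keep the original URL indexed
app.get('/summer-sale', (req, res) => res.redirect(302, '/sale-landing-2018'));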

Hear it from Google themselves.

So make sure that all of your pages are healthy and reachable. Fix any inefficient redirect chains, so pages are accessible to crawlers and users alike.

3. Have you implemented the canonical link correctly?

A canonical tag is used in the HTML head to tell Googlebot which page is the preferred, canonical one in cases of duplicated content.

For example, you may have the same content reachable at more than one URL, say with and without tracking parameters. In that case, you’d want the canonical to point back to your preferred default version.

Every page should, as a best practice, have a canonical tag: either linking back to itself, where the content is unique, or pointing to the preferred page, where it is duplicated.
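For reference, a self-referencing canonical is a single line inside the head element (example.com as a placeholder domain):

<link rel="canonical" href="https://www.example.com/your-unique-page/">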

Which brings us to the question: is the link you’re canonicalizing to correct?

In the case of a canonical page and its duplicates, only the canonical page will appear on Google Search. Google uses the canonical tag as an output filter for search, meaning the canonical version is given priority in ranking.

SEOPressor Connect lets you skip the step of manually inputting the canonical tag.

If that is not what you intended, fix your canonical and link it back to the right page, or to itself. That will do the trick.

4. Maybe you have exceeded your crawl budget

I’m on a budget.

Google has thousands of machines running spiders, but there are millions more websites out there waiting to be crawled.

Therefore, every spider arrives at your website with a budget: a limit on how many resources it can spend on you. This is the crawl budget.

Here’s the thing: as mentioned before, if your website has a lot of redirection chains, they will be eating into your crawl budget unnecessarily. Because of that, your crawl budget might be gone before the crawler reaches your new page.

How do you know what your crawl budget is? In your Search Console account there is a Crawl section where you can check your crawl stats.

Let’s say your website has 500 pages and Googlebot is only crawling 10 pages on your site per day. That crawl budget will not be enough for the new pages you’re pumping out.

In that case, there are a few ways to optimize your crawl budget.

First of all, authoritative sites tend to be given a bigger and more frequent crawl budget. So get those backlinks. More quality, relevant links pointing to your website signal that your website IS of good quality and high relevance to your niche.

We all know building up authority doesn’t happen in one day. So another thing that you can do is to make sure that your site can be crawled efficiently.

You need to make good use of your robots.txt file. We all have some pages on our website that don’t really need to be up there in Search: duplicate content, under-construction pages, dynamic URLs and so on.

You can specify which crawler an instruction applies to and which URL strings should not be crawled. As an example:
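# An illustrative robots.txt (the paths are placeholders):
# keep every crawler out of the under-construction area,
# and keep Googlebot out of internal search result pages
User-agent: *
Disallow: /under-construction/

User-agent: Googlebot
Disallow: /internal-search/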

That way, crawlers won’t spend budget unnecessarily on pages that don’t need crawling.

A list of the most common user agents includes:

Googlebot (Google)

Bingbot (Bing)

Slurp (Yahoo)

DuckDuckBot (DuckDuckGo)

Baiduspider (Baidu)

YandexBot (Yandex)

facebot (Facebook)

ia_archiver (Alexa)

One important thing I have already mentioned above, which also applies to optimizing your crawl budget: fix those redirect chains. They are not only inefficient, they are also eating up your crawl budget.

If there are any pages returning 40x errors, fix those too.

5. Is your page actually an orphan?

An orphan page is a page that has no internal links pointing to it. Perhaps a link is faulty, leaving the page unreachable, or a link was accidentally removed during a website migration.

Remember how the spiders work? They start from one URL and from there they crawl to other URLs that are linked.

An orphan page can’t be crawled because there is no path by which to crawl it. It is not linked from your website, hence the term orphan. That’s why interlinking is so important: it acts as a bridge for the crawlers from one piece of your content to another.

Read more about interlinking here: Why Internal Links Matter To Your SEO Effort?

How can you identify orphan pages?

If you’re like us and you’re using the WordPress CMS, you can export a full list of URLs for every page and piece of content on your website. Use that to compare with the unique URLs found in a site crawl.

Or you can look in your server’s log files for the list of unique URLs loaded over, say, the last 3 months. Again, compare that with the list you got from the site crawl.

To make your life easier, you can load the data into an Excel file and compare the lists. The URLs that don’t appear in both lists are the orphaned ones.
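If spreadsheets aren’t your thing, the same comparison is a few lines of Node.js (a sketch; it assumes each list has been saved as a plain text file with one URL per line, and the file names are placeholders):

const fs = require('fs');

// Read a one-URL-per-line file into a Set of URLs
const readUrls = (file) =>
  new Set(fs.readFileSync(file, 'utf8').split('\n').map(u => u.trim()).filter(Boolean));

const allPages = readUrls('cms-export.txt'); // every URL your CMS knows about
const crawled = readUrls('site-crawl.txt');  // every URL the crawler reached

// Orphans: pages the CMS has but the crawler never found
const orphans = [...allPages].filter(url => !crawled.has(url));
console.log(orphans);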

Once you know which pages are orphaned, fixing them is much easier.

Now all you need to do is link those orphan pages appropriately, so they are easily discoverable by users and crawlers alike. Also, don’t forget to update your XML sitemap.

Spidey, please do your job instead of lying around getting tied to a railway.

If everything is working nicely, no error codes are being returned and the robots tags are fine, but your page is still not showing up, why? Well, the issue might very well be on Google’s side. Maybe you are just not the crawler’s priority.

Google only has so many resources, and the bigger, more authoritative websites are allocated a bigger and more frequent crawl budget.

Here are 5 steps you can take to urge Googlebot to crawl your new pages faster.
1. Update your sitemap in Search Console

Submitting your sitemap to Google is like telling them, “Hey, check out these important URLs from my website and crawl them!”

They might not start crawling those URLs immediately but at least you gave them a heads-up.

If you run a huge site that updates constantly, keeping up with your robots file and multiple sitemaps will probably drive you nuts.

Keep in mind, though, that you can’t have noindex, nofollow on a page in your robots directives and then add that same page to your sitemap. Do you want it indexed or not?

To avoid things like that happening, maintaining a dynamically generated (e.g. ASPX) sitemap will probably be your best choice.
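For reference, a minimal sitemap.xml contains little more than a list of URLs (example.com as a placeholder):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/new-blog-post/</loc>
    <lastmod>2018-07-15</lastmod>
  </url>
</urlset>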

2. Submit your URL directly to Google for indexing

Similarly, Google lets you manually submit your new page to them.

It’s really simple. Just search for “submit URL to Google” and the search results will show you an input bar. Copy and paste the URL of your new page and click the submit button.

Voila, you have submitted a new URL to Google for crawling and indexing.

However, just like the sitemap, this only acts as a heads-up for Google to make them aware of the existence of your new page.

Just do it anyway when your web page has been sitting there for a month and is still not indexed. Doing something is better than nothing, right?

3. Use Fetch as Google in Search Console

You can request, directly from Google, a re-crawl and re-index of your page.

This can be done by logging into Search Console and performing a fetch request via Fetch as Google. After making sure that the fetched page appears correctly (all the pictures load, there are no broken scripts, etc.), you can request indexing, choosing between crawling only this single URL or this URL plus its direct links.

Again, Google warns that the request will not be granted immediately. It can still take days, or up to a week, for the request to be completed.

But hey, taking the initiative is better than sitting and waiting, right?

4. Get high domain authority

Government websites are one example of high-authority websites.

Once again, domain authority affects how frequent and how big your crawl budget will be.

If you want your new pages and website changes to be indexed swiftly, you have a better chance if your page rank is high enough.

This is a matter of slow and steady wins the race, though. If you can get a million backlinks off one single piece of content in a single day, that’s great.

But one great piece of content is not enough. Your website needs to be updated frequently and consistently with quality content, while simultaneously gaining quality backlinks, for your page authority to go up.

Start updating your website at least twice weekly, and reach out to the community to build brand awareness and connections.

Keep that effort up, and slowly but steadily your authority will rise and your website will be crawled and indexed much faster.

5. Have a fast page load speed

Here’s the thing: when you have a website that loads fast, Googlebot can crawl it faster.

In the unfortunate case where your website’s load speed is not satisfactory and requests frequently time out, you’re really just wasting your crawl budget.

If the problem stems from your hosting service, you should probably change to a better one. If, on the other hand, the problem comes from your website structure itself, you might need to clean up some code. Or better yet, make sure it is well optimized.

Read more about page speed and SEO here: The Connection Between Site Speed and SEO Today


What to consider when selecting marketing channels

Posted on Jul 15, 2018 in SEO Articles

Selecting the marketing channels to use is a complex decision – there’s no one-size-fits-all formula for distributing your product. Whether you’re just getting started or you’ve been in business for a while, it can be difficult to find effective ways to reach your target market.

One way to make the process simpler is to break down all the factors that play into channel selection. Your customer base, your available resources, and your product itself can all help guide your decision. Take the following factors into consideration as you weigh your options, and you’ll have an easier time choosing the right marketing channels for your business.

What are the physical attributes of your product?

Sometimes the physical attributes of a product dictate how it should be distributed. Not everything can be easily shipped, and some items need to be handled more carefully than others.

If your product is large and heavy, for instance, shipping it across the country may not be practical. And if you sell something perishable – like food or cosmetics – you’ll probably want to get that product into customers’ hands as quickly as possible. In cases like these, it’s best to look for a short marketing channel.

However, if your product is durable and easy to ship, you have more options. In this case, a longer distribution channel with more middlemen may give you certain advantages, like a wider distribution area.

What kind of brand image do you want to create?

Your brand’s overall image is shaped by your customers’ buying experience, from start to finish. Where and how a person buys your product is just as important as the quality of the product itself. As you consider your options for distribution channels, ask which ones support the kind of brand image you want to cultivate.

For instance, if you want people to associate your brand with uniqueness or exclusivity, you probably wouldn’t want to sell your products at Walmart, even if that meant reaching more customers. Rather, you’d probably want to target more exclusive retailers, or even focus on distributing your product online yourself.

How technical is your product?

The more specialized or difficult your product is to use, the more you’ll benefit from using short marketing channels or selling directly to customers. That’s because people are often reluctant to take a risk on an ‘intimidating’ product unless they’ve built up some trust with the business first. Leads must feel assured that you’ll help them with setup and provide tech support if something goes wrong. For example, if you sell specialized software or complex machinery, you’ll probably want to focus on choosing leads carefully and building relationships with them – not distributing your product as far and wide as possible.

Are you selling to individuals or businesses?

Business to business selling requires a different approach than business to consumer selling. If you’re selling to individuals, retail may be a good option for you, since most B2C businesses don’t need to build personal relationships with customers.

However, if you have a B2B business, retail is out of the picture. It’s too impersonal, and it won’t put your product in front of the right customers when they need it. Direct selling or selling through an agent will likely be your best bet.

How big and geographically diverse is your target market?

Where do you want to sell your product, and how many people do you expect to buy it? Look for marketing channels that can accommodate both the area you want to cover and the volume of customers you’re anticipating. For a small, local business, this could mean setting up your own store or selling your product door-to-door. If you want to reach a wider market, the internet is a good option that’s accessible even to small, new businesses.

Where and how does your target market like to shop?

Do some market research and figure out how your target audience prefers to shop. Do they visit retail stores? Do they place bulk orders online? Are they inclined to make impulse purchases, or do they research products carefully before making a decision? Knowing your market’s shopping habits will make it easier to position your product where buyers can find it.

How much time and effort can you spend on distribution?

Distributing a product takes a lot of resources and organization. Handing the product off to a middleman makes the process easier for many businesses. However, if you have the resources to do your own distribution through direct selling or an ecommerce site, you retain control over how your product gets into customers’ hands. You might also make more profits in the long run.

Which marketing channels do your competitors use?

It’s important to know how your competitors sell their products to customers. If you aren’t sure which channels your competitors are using, do a little research to find out.

You can use this information in a couple of ways. The first way is to adopt the same marketing channels your competitors use, or find very similar ones (and this includes social media). This strategy can work well because you know that your competitors’ channels have a built-in market for the types of products you sell.

Another approach is to avoid your competitors’ marketing channels entirely. Instead, look for different channels where your rivals have no reach, and sell there. This can be a very effective way to cut down on competition. However, it can be hard to find marketing channels that are both effective and untapped in your field. If you’re good at thinking outside the box and you have the resources to do plenty of promotion, this strategy might be worth a try.

Which channels offer you the most advantages?

Some marketing channels will offer you more advantageous partnerships than others. Make a list of the channels you’re considering, and ask yourself the following questions:

Will certain middlemen promote your product more than others?
How will your choice of middlemen affect your bottom line? How can you maximize your profits?
Do any channels have particularly favorable or unfavorable policies? For instance, if a potential partner wants the exclusive rights to distribute your product, that might not be a good deal for you.
What kind of reputation does each channel have? Is their business financially sound? Are they known for being reliable and pleasant to work with?

The take-away

There are a lot of moving parts to consider as you select marketing channels for your product. Deciding doesn’t have to be overwhelming, though. Weigh these nine important factors, both on their own and in relation to each other, as you consider your options. By putting plenty of thought and analysis into your decision, you’ll give yourself the best possible odds of selecting marketing channels that benefit your business for years to come.

Amanda DiSilvestro is a writer for No Risk SEO, an all-in-one reporting platform for agencies. You can connect with Amanda on Twitter and LinkedIn, or check out her content services at amandadisilvestro.com.

 

Why Internal Links Are as Powerful as External Links with Dawn Anderson

Posted by on Jul 15, 2018 in SEO Articles | Comments Off on Why Internal Links Are as Powerful as External Links with Dawn Anderson

Why Internal Links Are as Powerful as External Links with Dawn Anderson

Enjoy a brand new cognitiveSEO Talks – On Search and Traffic with Dawn Anderson, an all-around professional who will make you take out a pen and paper and note down everything she has to say.

 

There's a lot to say about Dawn. She is a skillful digital marketer and SEO consultant with over 11 years of experience and a lecturer at Manchester University. She is also a speaker and trainer who contributes to international conferences, and she founded Move It Marketing Agency back in 2012.

 

 

 

As Dawn herself mentioned in the talk above, SEO is about constantly digging and looking for secrets. And speaking of secrets, let me tell you something about Dawn. Although it seems she's been doing SEO and digital marketing since forever, she had a previous career in a totally different domain: she managed a building maintenance company. Maybe one of the greatest things about the search industry is that people come from very diverse backgrounds.

 

There are so many SEO elements you need to keep an eye on, so many ranking factors; we can never guess exactly what they are and this is why every year we end up with more Google ranking factors to consider.

DAWN Anderson

International SEO consultant & Digital Marketing Lecturer  @dawnieando / Move It

 

Aside from being a great SEO consultant, the director of Move It Marketing is also a big dog lover. And if you listen to the interview carefully, you'll hear her cute Pomeranian joining the conversation.

 

We could list dozens of reasons why one should listen to this cognitiveSEO Talks episode with Dawn. Yet, we wouldn't want to spoil the joy of discovering an engaging conversation sprinkled with insights and words of wisdom.

 

The key is always to try to make great sites and to develop your technical skills as much as you possibly can.

DAWN Anderson

International SEO consultant & Digital Marketing Lecturer  @dawnieando / Move It

 

 Tackled Topics: 

 

How Dawn got into SEO and search marketing
The importance of testing in SEO
Duplicate content issues for eCommerce sites
Crawl budget and how it impacts a site
Best use cases for PPC and SEO
The impact of voice search in SEO
The Google mobile speed update and its impact

 

  Top 10 Marketing Nuggets:  

 

Make sure you have some small test sites of your own even if you’re brand new in the industry. 2:06
The key is to try to make great sites and to develop your technical skills as much as you possibly can. 10:02
Internal links are for me often as powerful as external links. 11:35
Crawl budget on massive sites can impact rankings. 11:50
Technical issues are like a slow painful death by a thousand cuts on a big website. 14:26
[For e-commerce websites] Add photos, reviews, videos, linking content, xml sitemaps as it’s another way of enhancing your overall website with another layer of data Google does have access to. 23:34
We don’t know if links are being ignored by search engines. It’s a play in the dark. 29:45
Voice Search will massively affect SEO. Maybe not today but in the next couple of years, we’re going to see that. 31:01
No one person in this industry has all the answers. Be critical, read widely, test widely; make sure to try learning all the time. 36:12
If you have slow loading sites you should be looking to address it; one of the biggest things you can do is optimize images. 41:39

The post Why Internal Links Are as Powerful as External Links with Dawn Anderson appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

How to organize your keyword lists

Posted by on Jul 15, 2018 in SEO Articles | Comments Off on How to organize your keyword lists

How to organize your keyword lists

Keyword research is a fundamental tactic that I have seen completely transform the overall marketing strategies of those who take it seriously.

In fact, just about any marketing area begins with keyword research, be it competitive analysis, traffic growth, content planning, or PPC strategy. It has always been the foundation of online marketing and it still is – even though it’s rapidly evolving.

I have seen clients go from barely functioning marketing plans to full-scale content marketing projects that up their rankings and conversions. Keywords are serious weapons.

Why organize keywords?

Keyword lists are messy. They contain every little variation of each particular query because they include whatever enough people spontaneously type into the search box.

We search in a more disorganized way than we speak. For example, we could search ‘research keywords’, ‘how to research keywords’, ‘research keywords how to’, ‘keyword research tips’, or even ‘keyword research how to tips’ – and all of that will basically mean the same thing (i.e. we want to know how to research keywords).

Keyword research tools like SEMrush and Ahrefs will provide you with hundreds of thousands of those keyword strings (as well as marketing inspiration).

But how do you make sense of all those lists, which leave you with a huge pile of keywords dumped with no rhyme or reason? How do you turn them into plans and actions?

This is precisely why you should be taking the time to organize your keywords. It might not be a very fun process, but it is a very important one.

Here are some tips.

Usefulness and value

One popular way to organize your keywords is by usefulness of the keyword. How you define that is up to you, but many marketers categorize it by price per click balanced with the projected click rate. They also look at how likely it is that the keyword would help them rank on the first page or (more recently) get a featured snippet:

Top: the absolute best and most expensive keywords that you might try and target in the future, depending on time and budget, as well as how useful the end result would be in light of those factors
Moderate: middle ground keywords that cost less than the top, but have the highest potential within that price level. This is where most of your research should lead you and the largest portion of your spreadsheet is going to be dedicated to these
Bottom: the cheapest keywords to target aren't worth much as primary keyphrases. However, you may want to keep an eye on them anyway, either for inspiration on future phrases (to expand on) or as secondary/tertiary phrases for projects that require them.
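
If you keep your keyword data in a spreadsheet export, a few lines of scripting can do this bucketing for you. Here is a minimal Python sketch; the CPC thresholds and the column names are hypothetical, so adapt them to your own data:

import csv

# Hypothetical CPC thresholds (in dollars) for the three tiers described above.
TOP_CPC = 5.00
MODERATE_CPC = 1.50

def tier(cpc, projected_clicks):
    """Bucket a keyword by cost per click, weighed against projected clicks."""
    if cpc >= TOP_CPC and projected_clicks > 0:
        return "top"
    if cpc >= MODERATE_CPC:
        return "moderate"
    return "bottom"

# Assumes a CSV export with 'keyword', 'cpc' and 'projected_clicks' columns.
with open("keywords.csv", newline="") as f:
    for row in csv.DictReader(f):
        print(row["keyword"], "->", tier(float(row["cpc"]), float(row["projected_clicks"])))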

Featured tool: I like using Ahrefs' "clicks" data to determine the most useful phrases, i.e. phrases that are able to send a lot of traffic and those where my site ranks pretty well already:

[No other keyword research tool beats this insight.]

Relevancy

Many of the keyword strings in your lists mean pretty much the same thing. They get in your way, preventing you from focusing on other important aspects of keyword research, so getting rid of them (or rather, grouping them) is the first thing to do.

This is where keyword clustering comes in handy. I have already explained the tactic in detail here.

Featured tool: Serpstat looks at the Google SERPs for each phrase and determines related queries by their overlapping URLs. To the best of my knowledge, it's pretty much the only tool that can do that:
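
If you want to approximate the idea yourself, the core of it is simple: two keywords are related when their top results share URLs. Here is a toy Python sketch, assuming you have already fetched the top-ranking URLs per keyword from your rank-tracking source of choice; the SERP data below is made up:

# Toy keyword clustering by SERP overlap: phrases are grouped when their
# top result URLs overlap enough (Jaccard similarity).
serps = {
    "research keywords": {"url1", "url2", "url3", "url4"},
    "how to research keywords": {"url1", "url2", "url3", "url5"},
    "buy running shoes": {"url8", "url9"},
}

def jaccard(a, b):
    return len(a & b) / len(a | b)

THRESHOLD = 0.4  # hypothetical; tune against your own data

clusters = []
for keyword in serps:
    for cluster in clusters:
        if any(jaccard(serps[keyword], serps[member]) >= THRESHOLD for member in cluster):
            cluster.append(keyword)
            break
    else:
        clusters.append([keyword])

print(clusters)  # [['research keywords', 'how to research keywords'], ['buy running shoes']]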

Search intent

Another way to organize keywords is by intent, which is usually more straightforward. Set some goals about what you want to accomplish – not just with keyword research but your whole brand. Use that to inform your keyword strategy and separate each goal by intent so you have a list of keywords for each.

Say that you want to target the market for affordable time management programs. You will want to increase brand visibility, get a featured snippet for a popular query, and bring more attention to your social media. Make keyword lists for those three goals.

Usually search intent puts keywords into four groups:

Commercial 'high intent': these users are ready to buy now
Informational intent: these users are willing to read; they're not ready to buy but may opt in and stick around for a bit longer
Transactional intent: these users can be both (researching, then buying)
Navigational intent: these users are interested in a specific brand. Depending on whether that's your brand or someone else's, you may want to turn them into believers or snatch them from the competitor.

Featured tool: searching Google itself will give you some idea on what Google has found the intent to be. For example, if you see shopping results, you can be fairly sure Google has come to the conclusion that most of these searchers wanted to buy things.

[Chart source: Digitaleagles.]

Brand-focused queries

These should be a separate tab in your spreadsheet. Every company needs to make it easier for people to find them. Do this based on your brand name, [competitor alternatives], etc., which is an easy way to make sure your bases are covered and a simple way to organize your research.

Another way to do this is to target phrases that are negative and then prove them wrong with content. An example would be a phrase like, “Is [product name] a scam?” When users search it, they will find that no, you are not a scam and are not listed on any scam sites. This reassures them, even though the original search was negative.

Don’t forget to research all kinds of queries your (or your competitors’) brand includes:

[You may also want to label these queries by sentiment to give your content team more clues on how to address each one.]

By modifier

I always do these in their own list. A modifying keyword is one that uses an adjective to describe what is being searched for. For instance, they may search for ‘cheap project management platform’ or ‘free ways to manage teams’.

Words like free, cheap, top, best, etc., are fantastic modifiers and are easy to organize in their own section. Once you have had some trial and error you will know which work best.

Organizing by modifiers helps you evaluate trends in your niche and match them to your content and conversion funnel strategy. Do your potential customers tend to search for cheap or exclusive options? Are they looking for DIY or pre-built solutions? Organizing by modifiers gives you all those important answers.
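
As a quick illustration, here is a short Python sketch that buckets a keyword list by modifier; the modifier list is just a starting point to refine through that trial and error:

from collections import defaultdict

# A starting set of common modifiers; extend this as you learn what works.
MODIFIERS = ["free", "cheap", "best", "top", "diy", "exclusive"]

keywords = [
    "cheap project management platform",
    "free ways to manage teams",
    "best time tracking app",
    "project management platform",
]

groups = defaultdict(list)
for keyword in keywords:
    words = keyword.lower().split()
    # File the keyword under the first modifier it contains, else 'no modifier'.
    modifier = next((m for m in MODIFIERS if m in words), "no modifier")
    groups[modifier].append(keyword)

for modifier, bucket in groups.items():
    print(modifier, "->", bucket)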

I wrote about this type of keyword organizing in an older article at Moz:

Use a template that includes all relevant information

Finally, make sure you are using as much information as possible. Add volume/clicks, difficulty and anything else you can think to use. You may also consider adding labels for which type of action each keyword requires:

Optimize old content
Create new page

As well as the page type it's good for, e.g.:

Product page
Product list
Blog post
Video, etc.

There may be more labels if you are optimizing a local business website. Michael Gray described some in his article here.

That information should also include how it is working over time. I have made graphs with Excel using the data and gotten a much clearer picture of what is and isn’t bringing in the results I want. You can tweak from there.

Do you have a tip for organizing keywords? Let us know in the comments.

Yoast SEO 7.8: Synonyms and keyword distribution

Posted by on Jul 15, 2018 in SEO Articles | Comments Off on Yoast SEO 7.8: Synonyms and keyword distribution

Yoast SEO 7.8: Synonyms and keyword distribution

Semantics is hard. What does a certain word mean in a specific situation? Which ‘mars’ are you talking about? Have you ever tried to discover all definitions of ‘run’? In most cases, context is everything. You can help humans and machines understand a text better by adding context. This is one of the reasons Yoast SEO is now adding support for synonyms and related keywords, giving you more flexibility to improve your text! Now available for Premium users of Yoast SEO 7.8.


New Premium feature: Synonyms

Content SEO has long been about finding out what your main keyword was and adding that focus keyword in a couple of places in your text. While that worked pretty well, there’s a lot more going on at the moment. Not only is search intent more important than ever, but search engines get smarter and smarter every day. They increasingly ‘know’ what a text is about by looking at the context in which these focus keywords appear. This context is what makes or breaks a text.

Yoast SEO has always worked with a single focus keyword, or multiple focus keywords in our Premium plugin. We understand this can be a bit restrictive; we're not even looking at plural forms of the keyword. Luckily, that's about to change!

We’re working on some very nice new language-based SEO checks, and we’re presenting the first updates today: synonyms and keyword distribution! Yes, you read that right: Premium users can now add synonyms and related terms to check. Writing about bikes? Your synonyms will probably include ‘bicycle, cycle, ride, two-wheeler,’ and now you can add those terms. The Yoast SEO plugin will check how you use these terms in your article.

New Premium feature: keyword distribution

The new synonym feature also works in conjunction with another new feature in Yoast SEO Premium: keyword distribution. If you've added a couple of synonyms for your focus keyword, Yoast SEO now checks to see if these are distributed well throughout the text. Before, you could add your focus keyword to the intro a couple of times and that would be fine by us. That's over. We now take the complete text into regard, and we want you to distribute your focus keyword and synonyms evenly and realistically. The gif below shows what the highlighting of keywords and synonyms looks like.

We keep using the focus keyword exclusively to determine keyword density. In our opinion, optimizing your post for the most common keyword — the one that your keyword research uncovered as being most used by your audience — continues to be imperative. 
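
To make the distribution idea concrete, here is a rough Python sketch of one way to score it; this is a toy illustration, not Yoast's actual algorithm: find where the focus keyword and its synonyms occur and flag the largest stretch of text with no match.

import re

def largest_gap(text, terms):
    """Return the largest share of the text (0 to 1) with no keyword or synonym hit.

    Lower is better; 1.0 means the terms never appear at all.
    """
    positions = sorted(
        match.start()
        for term in terms
        for match in re.finditer(re.escape(term), text, re.IGNORECASE)
    )
    if not positions:
        return 1.0
    # Measure gaps between consecutive hits, including both ends of the text.
    boundaries = [0] + positions + [len(text)]
    return max(b - a for a, b in zip(boundaries, boundaries[1:])) / len(text)

text = "Bikes are great. A bicycle is cheap to run. Riding a two-wheeler keeps you fit."
print(largest_gap(text, ["bike", "bicycle", "two-wheeler"]))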

More on the way

This is just the start. At the moment, we’re hard at work to improve the language capabilities of Yoast SEO. Marieke wrote a post describing what you can expect from Yoast SEO in the coming months. Read about morphology, related keywords and the upcoming recalibration of the SEO analyses in Yoast SEO.

Feedback welcome!

We’ve added these new checks for you to try out. We’re very much looking forward to your feedback. How are you using synonyms and related topics in your texts? What do you want Yoast SEO to do with your synonyms? Are there ways to improve how we handle the analyses of your text? As we’ve said, this is the first step to a Yoast SEO that is far more capable of understanding language and using that knowledge to provide you with the best possible feedback. Help us get there! You can either add an issue to GitHub or comment on this post. We’re looking forward to your help!

Language improvements for French, Spanish and Italian

Yoast SEO 7.8 has turned out to be a release focused on language because we’ve also expanded the language functionality for French, Spanish and Italian. Users writing French and Spanish can now use the Flesch Reading Ease assessment to check the perceived difficulty of their texts. Users writing Italian can now improve their texts using the new passive voice assessment. French, Spanish and Italian now fully support all Yoast SEO features.

Other improvements and fixes

As always, we've fixed loads of bugs and improved various parts of the plugin. For instance, we've improved the way we determine OpenGraph metadata for front pages, especially in the case of static front pages. We've also fixed several bugs regarding the look and feel of the new snippet variables that we introduced in Yoast SEO 7.7.

Update now to Yoast SEO 7.8

Yoast SEO 7.8 is an exciting new release, one that marks a new direction for us. We’re giving you much more flexibility to enhance your articles by using synonyms and providing you with more tools to determine how well you present your keywords. This is the first step to an even more relevant, useful and indispensable Yoast SEO!

Read more: Why every website needs Yoast SEO »

The post Yoast SEO 7.8: Synonyms and keyword distribution appeared first on Yoast.

How Google Hummingbird and Knowledge Graph Shaped Search

Posted by on Jul 14, 2018 in SEO Articles | Comments Off on How Google Hummingbird and Knowledge Graph Shaped Search

How Google Hummingbird and Knowledge Graph Shaped Search

How Google Hummingbird and Knowledge Graph Shaped Search

Google started as a search engine

”To organize the world’s information and make it universally accessible and useful.”

That is Google's goal as a company. Despite their ventures into other technologies, from producing smartphones to being the major digital advertising provider, Google's roots stem from search, and search stems from data and user intent.

96% of Google's income comes from ads: around 70% from AdWords and the rest from AdSense. Google took advantage of their position as a search engine to offer advertising spots governed by their intricate algorithms.

This is possible because of their huge audience reach and, subsequently, the data they can collect.

Now, how does search work?

It starts with crawling, then rendering for the pages that need it, then indexing. When a user hits enter on a query, multitudes of algorithms powered by a vast number of machines process and filter the library of indexed data to present the user with a list: a list of what they deem the user wants.

Google is now equivalent to the verb 'search'. Do you say "let me look that up", or do you say "let me Google that", or perhaps "hey Google, what is…"? You and I and them, we all rely on Google when we want information, when we need an answer.

Fortunately (or unfortunately for Google), there are still countries out there where Google is not the preferred search engine. You have Baidu in mainland China, where Google tried to set foot and later backtracked, and Yandex in Russia, the fifth largest search engine in the world, while Naver and Daum dominate the Korean internet.

Despite Google's dominance, some search engines are still preferred in their home countries, like Yandex in Russia.

Despite these little setbacks in localization, Google still has an iron grip on a lot of countries.

To better serve search queries, Google needs data, a lot of data

Why does that concern us? Well, it concerns every user of the internet, because it means Google has access to billions of pieces of data from all around the world, in multiple languages and on different topics.

Google is building a massive, perhaps the biggest, virtual library. They achieve that by actively curating content from all around the world wide web, and they are actively getting their hands on more content and more content-processing technologies.

They don't only want to collect content and data; they are also trying to make sense of it. That's why they acquired the semantic search company Metaweb and the natural language processing company Wavii. They also track user behavior on platforms like Gmail and YouTube, down to your search history and browsing habits.

They collect data like dragons collect gems. What does Google need all that data for? What do dragons collect gems for? Perhaps greed, yes, it may be greed. But they are doing it in the name of understanding user intent.

Google collects data like a dragon collects gems.

Millions of pieces of data connected to every keyword people frequently search are used to decipher the intent behind a query, drawing on Google's access to a large pool of readily available content.

The data pool serves as a base for their search service, not forgetting ads, and who knows what else they're using the data for? They're just not telling us explicitly.

All of this is done to help them better analyze and pinpoint user intent. Google wants to know what exactly it is that users want, then serve it to them on a silver platter the moment a query is made on Google search. They want to please the users. Happy users, happier Google.

How Google uses Hummingbird and Knowledge Graph to better understand search

Are you aware that Google signed a contract with the Pentagon for Project Maven? The IT giant assisted in the weaponization of drones by providing the technology for self-learning machine image identification.

Do you see how cutting-edge Google's AI technology is, when even the Pentagon is enlisting their help?

Now, how does this concern search? You see, as I have mentioned before, search deals with two things: data and human intent. To understand data in a way that matches human intent, machines need to mimic the human thought process.

And one thing that machines have a hard time understanding is the complexity and sublimity of language.

The Hummingbird update, released back in 2013 and the biggest update to the search engine since 2001, was a major revamp of how search works. By incorporating semantic theory into their algorithms, Google opened a new path into the world of search.

The Google Hummingbird update at a glance.

This new path leads to Google understanding user intent much more fluently. Search is now one step closer to asking Google a question instead of inputting keywords that you hope will give you the result you want.

While the Knowledge Graph has been in the game a bit longer than the Hummingbird update, the understanding of what the Knowledge Graph actually is has always been a little muddy.

It can be the little knowledge graph card that appears as a dynamic Search Engine Results Page (SERP) feature on the right. It gives users a quick summary of the searched term. The information used for a knowledge graph card is usually pulled from multiple uncredited sources.

A Google Knowledge Graph card gives users an instant answer without their needing to leave the results page.

But knowledge graph can also be a pool of data. It is essentially a database that collects millions of pieces of data about keywords people frequently search for on the World Wide Web and the intent behind those keywords, based on the already available content.

Google always stresses user experience and looks for ways to build what they think will benefit users, from pushing the AMP project for a faster mobile browsing experience to the Better Ads Standard, where non-intrusive ads are celebrated.

How can they make sure that they are serving the users right? By getting more data and making sure they’re analyzing the data right.

Knowledge graph “… taps into the collective intelligence of the web and understand the world a bit more like people do.”

Knowledge graph is Google’s database while the Hummingbird algorithm is the brain trying to make sense out of it.

“Wherever we can get our hands on structured data, we add it,” Singhal said, from an interview with Search Engine Land about knowledge graph.

How Hummingbird and Knowledge Graph changed and shaped search – the start of semantic search

Search used to work on keywords and their literal meanings. A string of words doesn't mean the same thing to a machine as it does to a human. Humans naturally draw relationships between words to form a complete meaning, while machines extract the literal meaning of each individual word without that relational process.

That's not how we as humans work.

The human mind subconsciously stores information, which later serves as clues for tying new pieces of information into relationships. We're constantly putting together puzzle pieces to build a bigger picture. THAT is what Google wants to mimic.

The data collected and indexed is the information; tracked human behavior builds the context behind a query; and the algorithm is the web of threads trying to link them all together, using the words typed into the query box as hints, to make sense of user intent and give users what Google deems most fitting.

Google thirsts for data because it's the source of their knowledge. The only way to better serve users is to understand them, as closely as humans understand each other.

With semantic search comes the entity

Semantic search introduced a new way of analyzing data, one that involves semantic theory and most likely uses a semantic knowledge model.

What is a semantic knowledge model? Simply put, a semantic knowledge model describes what each piece of data means and where it fits among the others, helping to pinpoint relationships between different pieces of data.

The pieces of data are called entities. An entity is a thing, be it physical or conceptual. Each entity has attributes and characteristics which are then matched and related to those of other entities to make sense of their relationship.

This is required because most words have multiple meanings, which shift depending on the surrounding words and the context.

Just as humans naturally make sense of a group of words, the semantic knowledge model helps the search engine make sense, in a human-like way, of the words typed into the search bar.

The model is also used when Google decides which content to pull from their library to answer a search query.

In fact, apart from the "normal" index database, Google has a separate database called the related entities index database.

An entity is not simply a word or a thing anymore. An entity is understood and defined by its range of characteristics and attributes, closer to how a human perceives a word than to how a thesaurus would define it.

“You shall know a word by the company it keeps.” (Firth, J. R. 1957:11)

Let's look at it this way: an entity is a word understood along with the common sense that surrounds it.

A cherry is a fruit; it's sweet; it has a seed. These are all basic pieces of information that both computers and humans know. But one thing the computer might not know, and the human will, is the story where George Washington chopped down a cherry tree (or not, apparently it's a myth).

Humans process information in a much more complex way than machines do.

And that's what sets a good search engine apart from just a search engine: the ability to understand a word as a complex entity instead of just its dictionary definition.

Suddenly it makes sense how a piece of content can manage to rank when the keyword is mentioned minimally, the text is long and tedious to read, and not many pictures are sprinkled in.

On-page optimization can only do so much when that is not what Google is primarily looking at anymore.

What that means to marketers

Other than the widely known keyword density and backlinks, related entities are now another factor to consider.

That means optimizing for Latent Semantic Indexing (LSI) keywords is now more relevant than ever.

LSI, in the information retrieval sense, utilizes Latent Semantic Analysis (LSA) to produce a set of themes, or in this case a set of keywords, that matches a document by analyzing the words it mentions. The result is a set of conceptually similar keywords.

The distributional hypothesis assumes that a set of LSI keywords will appear naturally in a document or across multiple similarly themed documents. These keywords, or entities, are among the factors Google possibly uses to determine whether a piece of content is relevant.
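
For the curious, here is a minimal scikit-learn sketch of the LSA idea behind such tools: build a TF-IDF matrix over a set of documents, reduce it with truncated SVD, and read off the terms that weigh heaviest on each latent theme. The documents below are placeholders:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

# Placeholder documents; in practice you would feed in pages on your topic.
docs = [
    "keyword research tools help you find search volume and competition",
    "research keywords with a keyword tool to plan seo content",
    "bake a chocolate cake with flour sugar and cocoa",
    "a simple cake recipe with sugar butter and flour",
]

tfidf = TfidfVectorizer(stop_words="english")
matrix = tfidf.fit_transform(docs)

# Two latent themes for this tiny corpus; real corpora use far more.
svd = TruncatedSVD(n_components=2, random_state=0)
svd.fit(matrix)

terms = tfidf.get_feature_names_out()
for i, component in enumerate(svd.components_):
    top_terms = [terms[j] for j in component.argsort()[::-1][:5]]
    print(f"theme {i}: {top_terms}")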

Google confirms that they track related entities across the web through the existence of a related entities database, which means pages of content across the web are now also linked by related entities.

Google has both an index database and related entities index database.

What content creators and content marketers can take away from this is: keep your friends close and your enemies closer.

When you're thinking about a topic to expand on, look at what your competitors have written, because they will be your related entities. The content they have poured out is part of what Google makes of the topic.

Now is also the time to take search intent seriously. Google has pools of data and multiple ways to track user behavior to work out what users really want. We should do the same.

Will Google's instant answers take away traffic from marketers?

“Additionally, questions that users will likely have about an entity after submitting a query directed to the entity can be predicted and information about entities that are answers to those questions can be provided to the user as part of a response to the query…”

Webmasters and content writers alike are worried that with the SERP features popping up aiming to provide users with instant answers, their click-through rate will suffer.

Well, according to HubSpot, content showcased in a featured snippet, which occupies rank zero, actually gets around double the click-through rate of the normal blue-link results.

People are concerned that one day Google will be self-sufficient with their knowledge graph and content creators will no longer be needed.

Well, I think that until the day a machine can articulate its own original opinion on a topic instead of acting as a curator of information, content writers will not die out.

Why? Let’s take another look at Google’s mission.

They want “organized information that is useful and accessible”

Is Google organizing information? Only when the creator gives them clues about what a piece of content is.

Can Google generate information without any human input? No.

Who is the one who deems something useful or not? Human.

How can information be accessible? On digital networks created by humans.

Technology is born from the collected wisdom of humans and made as a tool to serve humans. It is the middleman of information transmission from one human to another. So no, Google will not take over the world. Not in the sense of content creation, anyway.

SEO strategy for the age of semantic search

Let’s plan out a strategy using Google’s mission as a guideline.

1. Make sure that your website appears organized to both crawlers and users

Structured data is what helps Google identify the entities present in your content.
Try to use metadata and markups wherever you can to communicate with the crawlers and indexers.

Readers might immediately understand that Homer Simpson is a name. But without proper markup, the search engine will not get it.

Schema acts like a tag to tell the search engine which part of your content means what.

It's a standard and widely used form of microdata born out of a collaborative effort between Google, Yahoo, Bing, and Yandex. When you see all these big names lined up, it means you'd better pay attention.

There is an extensive list of item types that you can use. Below are some useful and more general markups that may be of use to you (just so you don’t need to go through the library yourself).

What is it?

http://schema.org/Place
http://schema.org/Person
http://schema.org/Corporation
http://schema.org/Product

Do you have additional information about it?

http://schema.org/gender
http://schema.org/birthDate
http://schema.org/address
http://schema.org/email
http://schema.org/jobTitle

What kind of content is this?

http://schema.org/Blog
http://schema.org/Article
http://schema.org/HowTo
http://schema.org/Review
http://schema.org/SoftwareApplication
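
To tie this back to the Homer Simpson example above, here is a small Python sketch that emits a schema.org Person block as JSON-LD, a format Google accepts inside a script tag on your page; the person details are purely illustrative:

import json

# Illustrative schema.org Person markup; swap in your own entity's details.
person = {
    "@context": "http://schema.org",
    "@type": "Person",
    "name": "Homer Simpson",
    "gender": "Male",
    "jobTitle": "Safety Inspector",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "742 Evergreen Terrace",
        "addressLocality": "Springfield",
    },
}

# Paste the output into your page inside:
# <script type="application/ld+json"> ... </script>
print(json.dumps(person, indent=2))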

What is the first factor that makes you decide to stay, read the article or leave the page altogether?

Well, for me, it's how easy it is to read. Blocks of text can be slightly intimidating and boring, while too many pictures can be distracting. If there is a huge block of ads following me whenever I scroll down the page, I usually just give up.

Be organized not only for the machine but for the readers too. Make sure you structure your content clearly and in an expected manner.

Is your content in short paragraphs?
Are you using bullets, boxes, and numbering wherever applicable?
Are you featuring related pictures to demonstrate your points?
Are there any intrusive ads that disrupt the reading experience?
Is the content segmented according to theme?
Does your font contrast nicely with the background colour?
Is your font big enough to read comfortably?

2. Create useful content that answers search intent

I think "useful" should be stated more specifically as useful for the users. In content creators' words, the piece of content should match the search intent.

There are a few ways to achieve this. First and foremost, the golden rule of SEO is – do your keyword research.

I know I just stressed the importance of entities, but don't get confused, because a keyword is an entity too.

Find out what your targeted users are searching for. What troubles them? What excites them? What kind of information do they want to find out? In order to do that, you need to collect data.

Use the keyword planner in Google AdWords.

It will show you the search volume of your targeted keywords and also forecast how they will perform. All of that data comes from, of course, yours truly, Google.

Use keywordtool.io to get more keyword ideas.

This freemium tool can be a good source of long-tail keyword ideas. Although it lacks search volume data, the different sites it targets, like Google, YouTube, and Bing, give you a bigger variety of keyword candidates.

Use LSIGraph to up your relevance game

This LSI keyword tool gives you a list of keywords thematically related to your targeted keyword. See how many you can hit, or use it as a guide to see how relevant your piece is in the eyes of a machine.

You can gauge your content’s performance by looking at the user experience data.

Is the bounce rate too high? Or maybe the time on page is too short. All of that can help you build a picture of how relevant a topic is for the users.

Google Analytics is a simple place to start.

Make revisions to your pieces if they have the potential but the data just doesn’t match up.

People nowadays want everything fast, fast cars, fast internet speed, fast information.

That's why, during the creation process, you should strive to present the most information in the most direct way. Google loves how-tos and lists; you can often see those placed as featured snippets. Think:

how-tos
infographics
tables
things that can give readers the most information at a glance

Of course, there are those at the other end of the spectrum too. There are people who will read through lengthy pieces; those people want in-depth information, such as:

studies
reviews
insights
analysis
manuals

Give your audience a healthy sprinkle of both.

3. Make sure your website is accessible to crawlers and users alike

Accessibility comes in many ways and forms, from the load speed of your page to how many shares you have.

Accessible doesn't just mean putting something up on the internet and hoping that people will see it.

Machines need to pick it up before they can act as a bridge between you and the readers.

In the case of SEO, the first thing you need to do is give access to the crawlers. Then, of course, make sure they can actually interpret your HTML to make sense of your content.

Make sure your robots directives are set up correctly

To allow access to all crawlers sitewide:

User-agent: *
Disallow:

The default value, allowing indexing and access to links within a page, is:

<meta name="robots" content="index, follow">
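
If you want to sanity-check the result, Python's standard library can parse a robots.txt file the same way a well-behaved crawler would; swap the placeholder example.com URLs for your own:

from urllib.robotparser import RobotFileParser

# example.com is a placeholder; point this at your own site.
robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")
robots.read()  # fetches and parses the live robots.txt

for page in ("https://example.com/", "https://example.com/private/page"):
    print(page, "->", robots.can_fetch("Googlebot", page))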

For JavaScript websites, rendering can be a problem.

If your website is powered by JavaScript, the search engine might take longer to index your pages. Even though Google confirms that they can render pages themselves to properly index the JavaScript, the process is not perfect.

Google recommends that webmasters implement hybrid rendering or dynamic rendering to ease the indexing process. However, it is not required.

Considering the resources needed to handle rendering, you might want to think twice before implementing either.

But a JavaScript-powered website can actually be more accessible to users, thanks to service workers and the fact that everything is dynamically fetched.

Faster load speed makes your content more accessible.

Not everyone has an internet connection via optic fiber. 3G is still very much in use, and useful enough, for a big portion of the population, which is why a bloated website is not going to add points to your accessibility.

Bloated HTML can also slow down your website, making it more difficult for visitors to actually receive what you have to offer.

See how your load speed performs using Google PageSpeed Insights.

Spread your content across multiple sites.

Google search doesn’t need to be the only place people can stumble upon your content. You can take the initiative to share your content across multiple platforms.

Guest posting on other websites in your niche can often be helpful. For one, you can reach a different audience; it also establishes your brand name. Remember the related entities we talked about? Being mentioned on another website, even without a link back to you, can help leverage your relevance in the niche.

Some people like pictures more than words, and vice versa. Why sacrifice your reach with either type of audience by offering only one format? Convert your content into multiple formats.

You can create a stack of slideshows out of your content and share it on websites like slideshare.com to get double the outreach. Or create an infographic based on your study; that way, the image itself may be able to rank higher in Google Images than your block of text would.

Quora is an awesome place to get some exposure and establish authority. See a question where your last blog post is a perfect answer? Answer the question by lifting points off your text accompanied by a link to your page.

Be creative and take the initiative to make your content shareable.


The Rules of Link Building – Whiteboard Friday

Posted by on Jul 14, 2018 in SEO Articles | Comments Off on The Rules of Link Building – Whiteboard Friday

The Rules of Link Building – Whiteboard Friday

Posted by BritneyMuller

Are you building links the right way? Or are you still subscribing to outdated practices? Britney Muller clarifies which link building tactics still matter and which are a waste of time (or downright harmful) in today’s episode of Whiteboard Friday.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Happy Friday, Moz fans! Welcome to another edition of Whiteboard Friday. Today we are going over the rules of link building. It's no secret that links are one of the top three ranking factors in Google and can greatly benefit your website. But there is a little confusion around what's okay to do as far as links and what's not. So hopefully, this helps clear some of that up.

The Dos

All right. So what are the dos? What do you want to be doing? First and most importantly is just to…

I. Determine the value of that link. So aside from ranking potential, what kind of value will that link bring to your site? Is it potential traffic? Is it relevancy? Is it authority? Just start to weigh out your options and determine what’s really of value for your site.

II. Local listings still do very well. These local business citations are on a bunch of different platforms, and services like Moz Local or Yext can get you up and running a little bit quicker. They tend to show Google that this business is indeed located where it says it is. It has consistent business information — the name, address, phone number, you name it. But something that isn’t really talked about all that often is that some of these local listings never get indexed by Google. If you think about it, Yellowpages.com is probably populating thousands of new listings a day. Why would Google want to index all of those?

So if you’re doing business listings, an age-old thing that local SEOs have been doing for a while is create a page on your site that says where you can find us online. Link to those local listings to help Google get that indexed, and it sort of has this boomerang-like effect on your site. So hope that helps. If that’s confusing, I can clarify down below. Just wanted to include it because I think it’s important.

III. Unlinked brand mentions. One of the easiest ways you can get a link is by figuring out who is mentioning your brand or your company and not linking to it. Let’s say this article publishes about how awesome SEO companies are and they mention Moz, and they don’t link to us. That’s an easy way to reach out and say, “Hey, would you mind adding a link? It would be really helpful.”

IV. Reclaiming broken links is also a really great way to get back some of your links in a short amount of time with little to no effort. What does this mean? It means another site was sending people to a specific page on your site that you've since deleted or moved, so the link now 404s. Whatever the case, you want to make sure you 301 that broken URL so it pushes the authority elsewhere. Definitely a great thing to do anyway.
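
A quick way to find these reclamation targets is to export the destination URLs from your backlink tool of choice and check which ones now 404. A minimal Python sketch with the requests library; the URL list is a placeholder:

import requests

# Placeholder list; in practice, export the linked-to URLs from your backlink tool.
linked_urls = [
    "https://example.com/old-guide",
    "https://example.com/blog/current-post",
]

for url in linked_urls:
    try:
        status = requests.head(url, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        status = f"error: {exc}"
    if status == 404:
        print(f"{url} -> 404, candidate for a 301 redirect")
    else:
        print(f"{url} -> {status}")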

V. HARO (Help a Reporter Out). Reporters will notify you of any questions or information they’re seeking for an article via this email service. So not only is it just good general PR, but it’s a great opportunity for you to get a link. I like to think of link building as really good PR anyway. It’s like digital PR. So this just takes it to the next level.

VI. Just be awesome. Be cool. Sponsor awesome things. I guarantee any one of you watching likely has incredible local charities or amazing nonprofits in your space that could use the sponsorship, however big or small that might be. But that also gives you an opportunity to get a link. So something to definitely consider.

VII. Ask/Outreach. There’s nothing wrong with asking. There’s nothing wrong with outreach, especially when done well. I know that link building outreach in general kind of gets a bad rap because the response rate is so painfully low. I think, on average, it’s around 4% to 7%, which is painful. But you can get that higher if you’re a little bit more strategic about it or if you outreach to people you already currently know. There’s a ton of resources available to help you do this better, so definitely check those out. We can link to some of those below.

VIII. COBC (create original badass content). We hear lots of people talk about this. When it comes to link building, it’s like, “Link building is dead. Just create great content and people will naturally link to you. It’s brilliant.” It is brilliant, but I also think that there is something to be said about having a healthy mix. There’s this idea of link building and then link earning. But there’s a really perfect sweet spot in the middle where you really do get the most bang for your buck.

The Don’ts

All right. So what not to do. The don’ts of today’s link building world are…

I. Don’t ask for specific anchor text. All of these things appear so spammy. The late Eric Ward talked about this and was a big advocate for never asking for anchor text. He said websites should be linked to however they see fit. That’s going to look more natural. Google is going to consider it to be more organic, and it will help your site in the long run. So that’s more of a suggestion. These other ones are definitely big no-no’s.

II. Don't buy or sell links that pass PageRank. You can buy or sell links that have a nofollow attached, which signals that the link is paid for, whether it's an advertisement or a link you don't vouch for. So definitely look into those and understand how that works.

III. Hidden links. We used to do this back in the day, the ridiculous white link on a white background. They were totally hidden, but crawlers would pick them up. Don’t do that. That’s so old and will not work anymore. Google is getting so much smarter at understanding these things.

IV. Low-quality directory links. Same with low-quality directory links. We remember those where it was just loads and loads of links and text and a random auto insurance link in there. You want to steer clear of those.

V. Site-wide links also look very spammy. Site wide being whether it’s a footer link or a top-level navigation link, you definitely don’t want to go after those. They can appear really, really spammy. Avoid those.

VI. Comment links with over-optimized anchor link text, specifically, you want to avoid. Again, it’s just like any of these others. It looks spammy. It’s not going to help you long term. Again, what’s the value of that overall? So avoid that.

VII. Abusing guest posts. You definitely don’t want to do this. You don’t want to guest post purely just for a link. However, I am still a huge advocate, as I know many others out there are, of guest posting and providing value. Whether there be a link or not, I think there is still a ton of value in guest posting. So don’t get rid of that altogether, but definitely don’t target it for potential link building opportunities.

VIII. Automated tools used to create links on all sorts of websites. ScrapeBox is an infamous one that would create the comment links on all sorts of blogs. You don’t want to do that.

IX. Link schemes, private link networks, and private blog networks. This is where you really get into trouble as well. Google will penalize or de-index you altogether. It looks so, so spammy, and you want to avoid this.

X. Link exchange. This is in the same vein: back in the day, you used to submit a website to a link exchange, and they wouldn't grant you that link until you also linked to them. Super silly. This stuff does not work anymore, but there are tons of opportunities and quick wins for you to gain links naturally and more authoritatively.

So hopefully, this helps clear up some of the confusion. One question I would love to ask all of you is: To disavow or to not disavow? I have heard back-and-forth conversations on either side on this. Does the disavow file still work? Does it not? What are your thoughts? Please let me know down below in the comments.

Thank you so much for tuning in to this edition of Whiteboard Friday. I will see you all soon. Thanks.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!