Blog

A Quarter-Million Reasons to Use Moz’s Link Intersect Tool


Posted by rjonesx.

Let me tell you a story.

It begins with me in a hotel room halfway across the country, trying to figure out how I’m going to land a contract from a fantastic new lead worth $250,000 annually. We weren’t in over our heads by any measure, but the potential client was definitely looking at what most would call “enterprise” solutions, and we weren’t exactly “enterprise.”

Could we meet their needs? Hell yes we could — better than our enterprise competitors — but there’s a saying that “no one ever got fired for hiring IBM”; in other words, it’s always safe to go with the big guys. We weren’t an IBM, so I knew that by reputation alone we were in trouble. The RFP was dense, but like most SEO gigs, there wasn’t much in the way of opportunity to really differentiate ourselves from our competitors. It would be another “anything they can do, we can do better” meeting where we’d grasp for reasons why we were better. In an industry where so many of our best clients require NDAs that prevent us from producing really good case studies, how could I prove we were up to the task?

In less than 12 hours we would be meeting with the potential client and I needed to prove to them that we could do something that our competitors couldn’t. In the world of SEO, link building is street cred. Nothing gets the attention of a client faster than a great link. I knew what I needed to do. I needed to land a killer backlink, completely white-hat, with no new content strategy, no budget, and no time. I needed to walk in the door with more than just a proposal — I needed to walk in the door with proof.

I’ve been around the block a few times when it comes to link building, so I wasn’t at a loss when it came to ideas or strategies we could pitch, but what strategy might actually land a link in the next few hours? I started running prospecting software left and right — all the tools of the trade I had at my disposal — but imagine my surprise when the perfect opportunity popped up right in little old Moz’s Open Site Explorer Link Intersect tool. To be honest, I hadn’t used the tool in ages. We had built our own prospecting software on APIs, but the perfect link just popped up after adding in a few of their competitors on the off chance that there might be an opportunity or two.

There it was:

3,800 root linking domains to the page itself
The page was soliciting submissions
Took pull requests for submissions on GitHub!

I immediately submitted a request and began the refresh game, hoping the repo was being actively monitored. By the next morning, we had ourselves a link! Not just any link, but despite the client having over 50,000 root linking domains, this was now the 15th best link to their site. You can imagine me anxiously awaiting the part of the meeting where we discussed the various reasons why our services were superior to that of our competitors, and then proceeded to demonstrate that superiority with an amazing white-hat backlink acquired just hours before.

The quarter-million-dollar contract was ours.

Link Intersect: An undervalued link building technique

Backlink intersect is one of the oldest link building techniques in our industry. The methodology is simple. Take a list of your competitors and identify the backlinks pointing to their sites. Compare those lists to find pages that overlap. Pages which link to two or more of your competitors are potentially resource pages that would be interested in linking to your site as well. You then examine these sites and do outreach to determine which ones are worth contacting to try and get a backlink.

Let’s walk through a simple example using Moz’s Link Intersect tool.

Getting started

We start on the Link Intersect page of Moz’s new Link Explorer. While we had Link Intersect in the old Open Site Explorer, you’re going to want to use our new Link Intersect, which is built from our giant index of 30 trillion links and is far more powerful.

For our example here, I’ve chosen a random gardening company in Durham, North Carolina called Garden Environments. The website has a Domain Authority of 17 with 38 root linking domains.

We can go ahead and copy-paste the domain into “Discover Link Opportunities for this URL” at the top of the Link Intersect page. If you notice, we have the choice of “Root Domain, Subdomain, or Exact Page”:

I almost always choose “root domain” because I tend to promote a site as a whole, and I’m not interested in prospects that already link to some other page on the site. That is to say, by choosing “root domain,” any site that links to any page on your site will be excluded from the prospecting list. Of course, this might not be right for your situation. If you have a hosted blog on a subdomain or a hosted page on a site, you will want to choose subdomain or exact page to make sure you rule out the right backlinks.

You also have the ability to choose whether we report back to you root linking domains or pages. This is really important, and I’ll explain why.

Depending on your link building campaign, you’ll want to vary your choice here. Let’s say you’re looking for resource pages that you can list your website on. If that’s the case, you will want to choose “pages.” The Link Intersect tool will then prioritize pages that have links to multiple competitors on them, which are likely to be resource pages you can target for your campaign. Now, let’s say you would rather find publishers that talk about your competitors and are less concerned about them linking from the same page. You want to find sites that have linked to multiple competitors, not pages. In that case, you would choose “domains.” The system will then return the domains that have links to multiple competitors and give you example pages, but you won’t be limited only to pages with multiple competitors on them.

In this example, I’m looking for resource pages, so I chose “pages” rather than domains.

Choosing your competitor sites

A common mistake made at this point is to choose exact competitors. Link builders will often copy and paste a list of their biggest competitors and cross their fingers for decent results. What you really want are the best link pages and domains in your industry — not necessarily your competitors.

In this example I chose the gardening page on a local university, a few North Carolina gardening and wildflower associations, and a popular page that lists nurseries. Notice that you can choose subdomain, domain, or exact page as well for each of these competitor URLs. I recommend choosing the broadest category (domain being broadest, exact page being narrowest) that is relevant to your industry. If the whole site is relevant, go ahead and choose “domain.”

Analyzing your results

The results returned will prioritize pages that link to multiple competitors and have a high Domain Authority. Unlike some of our competitors’ tools, if you put in a competitor that doesn’t have many backlinks, it won’t cause the whole report to fail. We list all the intersections of links, starting with the most and narrowing down to the fewest. Even though the nurseries website doesn’t provide any intersections, we still get back great results!

Now we have some really great opportunities, but at this point you have two choices. If you really prefer, you can just export the opportunities to CSV like any other tool on the market, but I prefer to go ahead and move everything over into a Link Tracking List.

By moving everything into a link list, we’re going to be able to track link acquisition over time (once we begin reaching out to these sites for backlinks) and we can also sort by other metrics, leave notes, and easily remove opportunities that don’t look fruitful.

What did we find?

Remember, we started off with a site that has barely any links, but we turned up dozens of easy opportunities for link acquisition. One was a simple resources page on forest resources, a potential backlink that could easily be earned via a piece of content on forest stewardship.

We turned up a great resource page on how to maintain healthy soil and yards on a town government website. A simple guide covering the same topics here could easily earn a link from this resource page on an important website.

These were just two examples of easy link targets. From community gardening pages, websites dedicated to local creek, pond, and stream restoration, and general enthusiast sites, the Link Intersect tool turned up simple backlink gold. What is most interesting to me, though, was that these resource pages never included the words “resources” or “links” in the URLs. Common prospecting techniques would have just missed these opportunities altogether.

While it wasn’t the focus of this particular campaign, I did try the alternative of showing “domains” rather than “pages” that link to the competitors. We found similarly useful results using this methodology.

For example, we found CarolinaCountry.com had linked to several of the competitor sites and, as it turns out, would be a perfect publication to pitch for a story as part of a PR campaign promoting the gardening site.

Takeaways

The new Link Intersect tool in Moz’s Link Explorer combines the power of our new incredible link index with the complete features of a link prospecting tool. Competitor link intersect remains one of the most straightforward methods for finding link opportunities and landing great backlinks, and Moz’s new tool coupled with Link Lists makes it easier than ever. Go ahead and give it a run yourself — you might just find the exact link you need right when you need it.

Find link opportunities now!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Faceted Navigation and SEO: A Deeper Look


The complex web of factors that determine page counts for a site with faceted navigation. It’s about the SEO, folks

tl;dr: Skip to each “Takeaways” section if you want a few ideas for handling faceted navigation and SEO. But do so at your own risk. The “why” is as important as the “what.”

If you have ever shopped for anything online, you’ve seen faceted navigation. This is the list of clickable options, usually in the left panel, that can be used to filter results by brand, price, color, etc. Faceted navigation makes it possible to mix & match options in any combination the user wishes. It’s popular on large online stores because it allows the user to precisely drill down to only the things they are interested in.

An example of faceted navigation

But this can cause huge problems for search engines because it generates billions of useless near-duplicate pages. This wastes crawl budget, lowers the chances that all of the real content will get indexed, and it gives the search engines the message that the site is mostly low-quality junk pages (because, at this point, it is).

Many articles talk about faceted navigation and how to mitigate the SEO problems that it causes. Those are reactive strategies: How to prevent the search engines from crawling and indexing the billions of pages your faceted navigation created.

This is not one of those how-to articles.

Instead, it’s about the decisions that create massive duplication and how to avoid them from the start. It’s about the seemingly innocuous UX choices and their unintended consequences. My goal is to give you a deeper understanding of how each decision affects crawlability and final page counts. I’m hoping this will give you knowledge you can use, both to avoid problems before they start and to mitigate problems that can’t be avoided.

Match Types and Grouping

Faceted navigation is typically divided into groups, with a list of clickable options in each group. There might be one group for brand names, another for sizes, another for colors, etc. The options in a group can be combined in any of a few different ways:

“AND” matching — With this match type, the store only shows an item if it matches all of the selected options. “AND” matching is most often used for product features where it is assumed the shopper is looking for a specific combination of features, and is only interested in a product if it has all of them. (e.g., headphones that are both wireless and noise-canceling)
“OR” matching — With this match type, the store shows items that match any of the selected options. This can be used for lists of brand names, sizes, colors, price ranges, and many other things. The assumption here is that the user is interested in a few different things, and wants to see a combined list that includes all of them. (e.g., all ski hats available in red, pink or yellow).
“Radio button” matching — With this match type, only one option may be selected at a time. Selecting one option deselects all others. The assumption here is that the options are 100% mutually exclusive, and nobody would be interested in seeing more than one of them at a time. Radio buttons are often used to set sort order. They are also sometimes used to choose between mutually exclusive categories. (e.g., specifying the smartphone brand/model when shopping for phone cases) Some radio button implementations require at least one selected option (e.g., for sort order), and others don’t (e.g., for categories).

The options within a given group can be combined using any one of these match types, but the groups themselves are almost always combined with each other using “AND” matching. For example, if you select red and green from the “colors” group, and you select XL and XXL from the “sizes” group, then you will get a list of every item that is both one of those two colors and one of those two sizes.
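To make this concrete, here’s a minimal Python sketch (my own illustration, not code from any real store) of how one group’s selection accepts or rejects a product, and how the groups are then ANDed together:

def group_matches(product_tags, selected, match_type):
    # product_tags: set of option IDs the product carries for this group.
    # selected: set of currently selected option IDs (at most one for "radio").
    if not selected:                  # nothing selected: this group doesn't filter
        return True
    if match_type == "and":           # product must carry every selected option
        return selected <= product_tags
    if match_type == "or":            # any one selected option is enough
        return bool(selected & product_tags)
    if match_type == "radio":         # the single selected option must match
        return next(iter(selected)) in product_tags
    raise ValueError(match_type)

def product_visible(product, selections):
    # Groups are always combined with each other using "AND" matching.
    return all(
        group_matches(product.get(group, set()), selected, match_type)
        for group, (match_type, selected) in selections.items()
    )

# Example: a red XL wool hat, filtered on colors (OR) and sizes (OR)
hat = {"color": {"red"}, "size": {"xl"}, "material": {"wool"}}
selections = {"color": ("or", {"red", "green"}), "size": ("or", {"xl", "xxl"})}
print(product_visible(hat, selections))  # True: a selected color AND a selected size both match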

A typical real-world website will have several groups using different match types, with many options between them. The total number of combinations can get quite large:

The above example has just over 17 billion possible combinations. Note that the total number of actual pages will be much larger than this because the results from some combinations will be split across many pages.

For faceted navigation, page counts are ultimately determined by three main things:

The total number of possible combinations of options — In the simplest case (with only “AND” & “OR” matching, and no blocking) the number of combinations will be 2^n, where n is the number of options. For example, if you have 12 options, then there will be 2^12, or 4,096 possible combinations. This gets a bit more complicated when some of the groups are radio buttons, and it gets a lot more complicated when you start blocking things. (There’s a short sketch of this arithmetic right after this list.)
The number of matching items found for a given combination — The number of matching items is determined by many factors, including match type, the total number of products, the fraction of products matched by each filter option, and the amount of overlap between options.
The maximum number of items to be displayed per page — This is an arbitrary choice set by the site designer. You can set this to any number you want. A bigger number means fewer pages but more clutter on each of them.
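Since every test below starts from the number of combinations, here’s a small sketch of that arithmetic (mine, under the same assumptions): “AND”/“OR” groups each contribute 2^n combinations, a radio group contributes n + 1 (or n, if a selection is required), and the groups multiply together.

def combination_count(groups):
    # groups: list of (match_type, option_count) tuples.
    # "and"/"or": each of n options is independently on or off -> 2**n.
    # "radio": one selected option, or none -> n + 1
    # (drop the +1 for implementations that require a selection).
    total = 1
    for match_type, n in groups:
        total *= 2 ** n if match_type in ("and", "or") else n + 1
    return total

print(combination_count([("and", 12)]))    # 4,096 -- the 2^12 example above
print(combination_count([("and", 32)]))    # 4,294,967,296 -- Tests #1 and #3
print(combination_count([("radio", 32)]))  # 33 -- Test #4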

 

Test: How Does Match Type Affect Page Counts?

The choice of match type affects the page count by influencing both the number of combinations of options and also the number of matching items per combination.

How were these results calculated?
All of the numeric results in this article were generated by a simulation script written for this purpose. This script works by modeling the site as a multi-dimensional histogram, which is then repeatedly scaled and re-combined with itself each time a new faceted nav option is added to the simulated site. The script simulates gigantic sites with many groups of different option types relatively quickly. (For previous articles, I have always generated crawl data using an actual crawler, running on a test website made up of real HTML pages. That works fine when there are a few tens of thousands of pages, but some of the tests for this article have trillions of pages. That would take my crawler longer than all of recorded human history to crawl. Civilizations rise and fall over centuries. I decided not to wait that long.)
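The author’s script isn’t published, but for intuition, here’s a rough Monte Carlo sketch of my own (using numpy) for the simplest case: one big “AND” group of independent options, as in Test #1 below. It won’t reproduce the published figures exactly, but it lands in the same ballpark:

from math import comb
import numpy as np

def estimate_and_matching(options=32, products=10_000, match_rate=0.2,
                          per_page=10, trials_per_k=10_000, seed=0):
    # Assumes each option independently matches `match_rate` of the products,
    # so a combination of k selected options matches Binomial(products,
    # match_rate**k) items. Every combination renders at least one page
    # (the "Zero items found" page still counts as a page).
    rng = np.random.default_rng(seed)
    total_pages = total_empty = 0.0
    for k in range(options + 1):
        combos = comb(options, k)  # number of combinations with k options selected
        items = rng.binomial(products, match_rate ** k, size=trials_per_k)
        pages = np.maximum(1, np.ceil(items / per_page))
        total_pages += combos * pages.mean()
        total_empty += combos * (items == 0).mean()
    return total_pages, total_empty

pages, empty = estimate_and_matching()
print(f"~{pages:,.0f} pages, ~{empty:,.0f} empty results")
# Compare: the article reports 4,295,064,687 pages and 4,294,724,471 empty results.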

Test #1 — Simple “AND” Matching

Suppose we have a site with the following properties:

The faceted nav consists of one big group, with 32 filtering options that can be selected in any combination.
There are 10,000 products.
On average, each filtering option matches 20% of the products.
The site displays (up to) 10 products per page.
Options are combined using “AND” matching.

The above assumptions give you a site with:

4,294,967,296 different combinations of options
4,295,064,687 pages.
4,294,724,471 empty results.

The obvious: The number of pages is enormous, and the vast majority of them are empty results. For every 12,625 pages on this site, one shows actual products. The rest show the aggravating “Zero items found” message. This is a terrible user experience and a colossal waste of crawl budget. But it’s also an opportunity.

So what can we do about all those empty results? If you are in control of the server side code, you can remove them. Any option that would lead to a page that says “Zero items found” should either be grayed out (and no longer coded as a link) or, better yet, removed entirely. This needs to be evaluated on the server side each time a new page is requested. If this is done correctly, then each time the user clicks on another option, all of the remaining options that would have led to an empty result will disappear. This reduces the number of pages, and it also dramatically improves the user experience. The user no longer has to stumble through a maze of mostly dead ends to find the rare combinations that show products.
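As a sketch of that server-side check (a toy “AND”-matching model of my own, not production code): before rendering the nav, tentatively add each candidate option to the current selection, count the results, and drop any option that would return zero items.

def result_count(products, selection):
    # products: list of sets of option IDs; selection: set of selected IDs ("AND" matching)
    return sum(1 for tags in products if selection <= tags)

def live_options(products, selection, candidates):
    # Only the options that still return at least one product if added.
    return [opt for opt in candidates
            if opt not in selection
            and result_count(products, selection | {opt}) > 0]

products = [{"red", "wool"}, {"red", "cotton"}, {"blue", "wool"}]
print(live_options(products, {"red"}, ["wool", "cotton", "blue"]))
# -> ['wool', 'cotton']  ("blue" would be an empty result, so its link is removed)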

So let’s try this.

Test #2 — “AND” Matching, With Empty Results Removed

This test is identical to Test #1, except now all links that lead to empty results are silently removed.

This time, we get:

1,149,017 (reachable) combinations of options.
1,246,408 pages.
0 empty results. (obviously, because we’ve removed them)

This may still seem like a lot, but it’s a significant improvement over the previous test. The page count has gone from billions down to just over one million. This is also a much better experience for the users, as they will no longer see any useless options that return zero results. Any site that has faceted nav should be doing this by default.

Test #3 — “OR” Matching

This test uses the same parameters as Test #1, except it uses “OR” matching:

The faceted nav still has 32 filtering options
There are still 10,000 products.
Each filtering option still matches 20% of the products.
The site still displays 10 products per page.
Options are now combined using “OR” matching instead of “AND” matching.

This gives us:

4,294,967,296 different combinations of options.
4,148,637,734,396 pages (!)
0 empty results.

The number of combinations is precisely the same, but the number of pages is much higher now (966 times higher), and there are no longer any empty results. Why is the page count so high? Because, with “OR” matching, every time you click on a new option the number of matching items increases. This is the opposite of “AND” matching, where the number decreases. In this test, most combinations now include almost all of the products on the site. In Test #1, most combinations produced empty results.

There are no empty results at all in this new site. The only way there could be an empty result would be if you chose to include a filtering option that never matches anything (which would be kind of pointless). The strategy of blocking empty results does not affect this match type.

Test #4 — Radio Buttons

This test uses radio button matching.

If we repeat Test #1, but with radio button matching, we get:

33 different combinations of options.
7,400 pages.
0 empty results.

This is outrageously more efficient than any of the others. The downside of radio button matching is that it’s much more restrictive in terms of user choice.

The takeaway: Always at least consider using radio button matching when you can get away with it (any time the options are mutually exclusive). It will have a dramatic effect on page counts.

Recap of Tests #1–4:

Test
Configuration
Page count

1
“AND” matching (without blocking empty results)
4,295,064,687

2
“AND” matching, with empty results blocked
1,246,408

3
“OR” matching
4,148,637,734,396

4
Radio buttons
7,400

Takeaways

The choice of match type is important and profoundly impacts page counts.
“OR” matching can lead to extremely high page counts.
“AND” matching isn’t as bad, provided you are blocking empty results.
You should always block empty results.
Blocking empty results helps with “AND” matching, but doesn’t affect “OR” matching.
Always use radio buttons when the options are mutually exclusive.

How Grouping Affects Page Count

So far, we have looked at page counts for sites that have one big group of options with the same match type. That’s unrealistic. On a real website, there will usually be many groups with different match types. The exact way the options are separated into groups is another factor that can affect page counts.

Test #5 — “OR” Matching, Split Into Multiple Groups

Let’s take the original parameters from Test #3:

The faceted nav has a total of 32 filtering options.
There are 10,000 products.
On average, each filtering option matches 20% of the products.
The site displays up to 10 products per page.
Options are combined using “OR” matching.

But this time, we’ll redo the test several times, and each time, we’ll split the 32 options into a different number of groups.

This gives us:

Configuration
Pages
Empty Results

1 group with 32 options
4,148,637,734,396
0

2 groups with 16 options per group
2,852,936,777,269
0

4 groups with 8 options per group
466,469,159,950
0

8 groups with 4 options per group
5,969,194,867
290,250,752

16 groups with 2 options per group
4,296,247,759
4,275,284,621

The interesting thing here is that the last two tests have some empty results. Yes, all groups used “OR” matching, and yes, I told you “OR” matching does not produce empty results. So what’s going on here? Remember, no matter which match types are used within each group, the groups are combined with each other using “AND” matching. So, if you break an “OR” group into many smaller “OR” groups, you get behavior closer to an “AND” group.

Another way to put it: Suppose there are eight groups with four options each, and the user has selected exactly one option from each group. For any item to show up in those results, the item would have to match all eight of those selected options. This is functionally identical to what you would get if those eight selected options were part of an “AND” group.

If you are blocking empty results (which you should be doing anyway), then the actual page counts for the last two tests will be much smaller than is shown in this table. Before you get all excited, note that you have to have quite a few groups before this starts happening. It’s possible some site might be in a market where it makes sense to have eight groups with four options each, but it isn’t something that will happen often.

The boring but more practical observation is that even breaking the group into two parts reduces the page count noticeably. The difference isn’t huge, but it’s enough to be of some value. If a group of options that uses “OR” matching can be logically separated into two or more smaller groups, then it may be worth doing.

Test #6 — “AND” Matching, Split Into Multiple Groups

(I’m including this test because, if I don’t, people will tell me I forgot to do this one)

This test is the same as Test #5, but with “AND” matching instead of “OR” matching (and empty results are now being blocked).

Configuration
Pages

1 group with 32 options
1,246,408

2 groups with 16 options per group
1,246,408

4 groups with 8 options per group
1,246,408

8 groups with 4 options per group
1,246,408

16 groups with 2 options per group
1,246,408

Yep. They all have the same number of pages. How can this be? The options within each group use “AND” matching, and groups are combined with each other using “AND” matching, so it doesn’t matter if you have one group or several. They are functionally identical.

Takeaway

If you want to split up an “AND” group because you think it will make sense to the user or will look nicer on the page, then go for it, but it will not affect page counts.

Other Things that Affect Page Counts
Test #7 — Changing “Items per Page”

This test uses the following parameters:

The faceted nav consists of five groups, with varying option counts, and a mix of different match types.
There are 10,000 products.
On average, each filtering option matches 20% of the products.
Links to empty results are blocked.

The test was repeated with different values for “Items per Page.”

This gives us:

Configuration
Page Count

10 items per page
18,690,151,025

32 items per page
10,808,363,135

100 items per page
8,800,911,375

320 items per page
8,309,933,890

1,000 items per page
8,211,780,310

This makes a difference when the values are small, but the effect tapers off as the values get larger.

Test #8 — Adding a Pagination Limit

Some sites, especially some very large online stores, try to reduce database load by setting a “pagination limit.” This is an arbitrary upper limit to the number of pages that can be returned for a given set of results.

For example, if a given filter combination matches 512,000 products, and the site is set up to show 10 products per page, this particular combination would normally create 51,200 pages. Some sites set an arbitrary limit of, say, 100. If the user clicks all the way to page 100, there is no link to continue further.
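The arithmetic, as a quick sketch:

import math

def page_count(item_count, items_per_page=10, pagination_limit=None):
    pages = max(1, math.ceil(item_count / items_per_page))  # an empty result still renders one page
    return pages if pagination_limit is None else min(pages, pagination_limit)

print(page_count(512_000))                        # 51,200 pages, as above
print(page_count(512_000, pagination_limit=100))  # capped at 100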

These sites do this because, compared to delivering pages at the start of a pagination structure, delivering pages deeper in a pagination structure creates a massive load on the database (for technical reasons beyond the scope of this article). The larger the site, the greater the load, so the largest sites have to set the arbitrary limit.

This test uses the following parameters:

The faceted nav consists of five groups, with varying option counts, and a mix of different match types.
There are 500,000 products.
On average, each filtering option matches 20% of the products.
Links to empty results are blocked.

The test was repeated with different values for the pagination limit.

This gives us:

Pagination Limit
Total Page Count

5
12,079,937,370

10
13,883,272,770

20
15,312,606,795

40
16,723,058,170

80
17,680,426,670

160
18,252,882,040

(no limit)
18,690,151,025

That’s definitely an improvement, but it’s underwhelming. If you cut the pagination limit in half, you don’t wind up with half as many pages. It’s more in the neighborhood of 90% as many. But this improvement is free because this type of limit is usually added for reasons other than SEO.

Pagination Takeaways

Test 7:

For lower values, changing “Items per Page” improves page counts by a noticeable amount.
When the values get higher, the effect tapers off. This is happening because most of the results now fit on one page. (and the page count can’t get lower than one)

Test 8:

If you have a huge site implementing a pagination limit primarily for database performance reasons, you may see a minor SEO benefit as a free bonus.
If you’re not also doing this to reduce database load, it’s not worth it.

Selectively Blocking Crawlers

All of the tests so far let the crawler see all of the human-accessible pages. Now let’s look at strategies that work by blocking pages via robots meta, robots.txt, etc.

Before we do that, we need to be clear about what “page count” really means. There are actually three different “page counts” that matter here:

Human-readable page count — Pages that can be viewed by a human being with a browser.
Crawlable page count — Pages that a search engine crawler is allowed to request.
Indexable page count — The number of pages that the search engine is allowed to index, and to potentially show in search results.

The crawlable page count is important because it determines how much crawl budget is wasted. This will affect how thoroughly and how frequently the real content on the site gets crawled. The indexable page count is important because it effectively determines how many thin, near-duplicate pages the search engines will try to index. This is likely to affect the rankings of the real pages on the site.

Test #9 — Selection Limit via Robots Meta with “noindex, nofollow”

In this test, if the number of selected options on the page gets above a pre-specified limit, then <meta name="robots" content="noindex,nofollow"> will be inserted into the HTML. This tells the search engines not to index the page or follow any links from it.
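A minimal server-side sketch of that rule (the helper and the limit value are hypothetical):

SELECTION_LIMIT = 2  # hypothetical limit; tune per site

def robots_meta_tag(selected_options):
    # Pages with too many facet options selected get marked noindex,nofollow,
    # so crawlers neither index them nor follow links deeper into the facet space.
    if len(selected_options) > SELECTION_LIMIT:
        return '<meta name="robots" content="noindex,nofollow">'
    return ""  # page stays indexable and followable

print(robots_meta_tag(["red", "wool", "xl"]))  # blocked: 3 > 2
print(robots_meta_tag(["red"]))                # no tag: stays indexable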

This test uses the following parameters:

The faceted nav consists of five groups, with varying option counts, and a mix of different match types.
There are 10,000 products.
On average, each filtering option matches 20% of the products.
Links to empty results are blocked.

For this test, the “selection limit” is varied from 0 to 5. Any page where the number of selected options is larger than this selection limit will be blocked, via robots meta tag with noindex, nofollow.

selection limit
crawlable pages
indexable pages

0
11,400
1,000

1
79,640
11,400

2
470,760
79,640

3
2,282,155
470,760

4
9,269,631
2,282,155

5
32,304,462
9,269,631

(no limit)
18,690,151,025
18,690,151,025

In these results, both indexable and crawlable page counts are reduced dramatically, but the number of crawlable pages is reduced by much less. Why? Because a robots meta tag is part of the HTML code of the page it is blocking. That means the crawler has to load the page in order to find out it has been blocked. A robots meta tag can block indexing, but it can’t block crawling. It still wastes crawl budget.

You might well ask: If robots meta can’t directly block a page from being crawled, then why is the crawlable page count reduced at all? Because crawlers can no longer reach the deepest pages: The pages that link to those pages are no longer followed or indexed. Robots meta can’t directly block crawling of a particular page, but it can block the page indirectly, by setting “nofollow” for all of the pages that link to it.

Test #10 — Repeat of Test #9, But With “noindex, follow”

This a repeat of test #9, except now the pages are blocked by a robots meta tag with “noindex, follow” instead of “noindex, nofollow.” This tells the crawler that it still shouldn’t index the page, but it is OK to follow the links from it.

(I’m only including this one because, if I don’t, someone is bound to tell me I forgot to include it.)

selection limit
crawlable pages
indexable pages

0
18,690,151,025
1,000

1
18,690,151,025
11,400

2
18,690,151,025
79,640

3
18,690,151,025
470,760

4
18,690,151,025
2,282,155

5
18,690,151,025
9,269,631

(no limit)
18,690,151,025
18,690,151,025

This scheme reduces the number of indexable pages, but it does nothing whatsoever to prevent wasted crawl budget. Wasted crawl budget is the main problem that needs to be solved here, which makes this scheme useless. There are some use cases (unrelated to faceted nav) where “noindex, follow” is a good choice, but this isn’t one of them.

Can the selection limit be implemented with robots.txt?

As shown in test #9, using robots meta tags to implement a selection limit is not ideal, because robots meta tags are part of the HTML of the page. The crawler has to load each page before it can find out if the page is blocked. This wastes crawl budget.

So what about using robots.txt instead? Robots.txt seems like a better choice for this, because it blocks pages from being crawled, unlike robots meta, which blocks pages from being indexed and/or followed. But can robots.txt be used to selectively block pages based on how many options they have selected? The answer is: it depends.

This depends on the URL structure. In some cases it’s simple, in others it’s difficult or impossible.

For example, if the URL structure uses some completely impenetrable format like base-64-encoded JSON:

https://example.com/products?p=WzczLCA5NCwgMTkxLCAxOThd

Then you are out of luck. You cannot use robots.txt to filter this, because there’s no way for robots.txt to tell how many selected options there are. You’ll have to use robots meta or the X-Robots-Tag HTTP header (both of which can be generated by the server-side code, which has access to the decoded version of the query data).

On the other hand, if all filter options are specified as a single underscore-separated list of ID numbers in the query string, like this:

https://example.com/products?filters=73_94_191_198

Then you can easily block all pages that have more than (for example) two options selected, by doing this:

User-agent: *
Disallow: /products?*filters=*_*_
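The pattern generalizes: a URL with more than N selected options contains at least N underscores after the parameter, so you repeat *_ N times. A throwaway generator, assuming the underscore-separated URL scheme above:

def disallow_rule(selection_limit, path="/products", param="filters"):
    # More than `selection_limit` underscore-separated IDs means the URL
    # contains at least `selection_limit` underscores after the parameter.
    return f"Disallow: {path}?*{param}=" + "*_" * selection_limit

print(disallow_rule(2))  # Disallow: /products?*filters=*_*_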

So let’s try this.

Test #11 — Selection Limit, via Robots.txt

This is a repeat of test #9, except now the pages are blocked using robots.txt instead of robots meta.

selection limit
crawlable pages
indexable pages

0
1,000
1,000

1
11,400
11,400

2
79,640
79,640

3
470,760
470,760

4
2,282,155
2,282,155

5
9,269,631
9,269,631

(no limit)
18,690,151,025
18,690,151,025

Takeaways

Blocking pages based on a selection limit is a very effective way to reduce page counts.
Implementing this with robots.txt is best.
But you can only use robots.txt if the URL structure allows it.
Implementing this with robots meta is less effective than robots.txt, but still useful.

Summary

Faceted navigation is one of the thorniest SEO challenges large sites face. Don’t wait until after you’ve built your site to address these issues. Plan ahead. Use robots.txt, look at selection options, and “think” like a search engine.

A little planning can improve use of crawl budget, boost SEO, and improve the user experience.

The post Faceted Navigation and SEO: A Deeper Look appeared first on Portent.

How to scale content production to capture the long-tail opportunity


Here’s something we all know so well that nobody needs to say it anymore: content is king.

We know it because we’ve been hit over the head with the phrase more times than you can shake a page view at. There’s no getting away from it: producing high-quality, engaging content and unique copy is vital for SEO, brand awareness, and affinity.

There will be few digital marketers out there who are not painfully aware of the challenge. When resources, time, and money are (more likely than not) limiting factors, how do you produce large amounts of content to a high enough standard to be effective?

This can be especially true if you or your client is a business with many different product lines, or in multiple locations around the world. The potential topics are infinite, red tape acts as a bottleneck, and copywriters can be overworked and expensive.

The good news is that with the rising popularity of remote working and digital nomads, paired with a solid strategy and process, you don’t have to make the impossible choice between quality and quantity.

Use a network of freelancers

Perhaps you have a short-term project in the pipeline, or your client suddenly wants to dramatically increase the amount of content in production. What do you do? Hiring a team of copywriters is expensive.

The freelance market, however, is competitive, and these days you don’t have to compromise quality for the sake of cost. Digital nomads are highly skilled, maybe even multilingual, and are likely to be based in countries where the cost of living is low.

Of course, this might not work for you if you need writers based in your market, in which case you could use your international freelancers for other means. Have you got a killer strategist on your books, or someone who speaks four languages who could translate and localize your copy using their knowledge of your markets? Make use of their skills.

It goes without saying that good communication is central to making it work with freelancers. Make yourself as available as possible to your writers and remind them again and again that there is no such thing as a silly question. Building a personal rapport is vital—video calls are great for this, and often far quicker than trying to painfully explain something over email. Apps such as Google Hangouts will become your best friend, for when a simple question requires a quick answer.

With freelancers you have the opportunity to not only become more cost-effective, but to make time zones work for you. This is the key: whilst you’re sleeping, some of your freelancers will be working. Manage this effectively and the amount you produce will rapidly increase, without compromising on quality.

Establish a process

It sounds absurdly simple, but if you don’t set up a clear, defined process, then you’re at very real risk of not achieving the core goals of the project. Common pitfalls include repeating work (or producing the wrong content due to poor briefs), missing deadlines, and inefficiently handling budgets.

It may take some time to set up, but it will undoubtedly pay off once it’s up, running, and ticking along by itself whilst you dedicate yourself to other tasks.

Firstly, one of the most useful things you can do is to spend some time getting your briefs watertight. Provide key details about the client, background information for the task such as the target audience, and clearly explain how this work fits into the wider strategy. Outline the deliverables clearly, and provide a step-by-step guide and examples if necessary.

Brief templates can help with this, especially if you’re producing different types of hygiene content for the same client. It will be worth it when you receive the work back exactly as needed, with minimal questions in the process, and future you will thank you.

Secondly, I strongly advise setting up trackers, because let’s face it: the benefit of a good Excel document cannot be overestimated. Create them so you know what stage your project is at from a glance, and include pricing information and details of your freelancers. These trackers should essentially be a one-stop shop for everything you need to know about the project. This will be invaluable not only for measuring where you are in the process but also for reporting.

Project tracking and management services such as Trello can be a godsend. Make use of them. Here at Croud we have our own proprietary technology, Croud Control, which allows us to manage huge content projects flexibly, with full visibility and control over every aspect of each project.

If this all sounds a little exhausting, why not use a trusted freelancer to manage this process for you? That way you only need to brief one person (although admittedly you will probably need to do a deep-dive), and providing you have regular check-ins along the way, you will only need to get involved at the final stage.

QA, QA, and QA again

Speaking of the final stage: check everything. Then check again.

It is unavoidable that your copywriters will make mistakes, as they are human beings. It’s also possible that your proofreaders will miss the odd spelling mistake here or there. This is the reason why I operate on a two-stage QA process at a minimum.

If your client is a multinational company, you may be required to translate or localize your copy into several different languages. It goes without saying that native speakers should perform the QA on this type of work, especially if the copywriter was a non-native speaker.

Providing your freelancers with feedback is crucial to the success of content projects, aside from just being a decent thing to do. After all, everyone wants to do a good job and more likely than not, wants to know how they could do it better.

Tight budgets mean you might have to get creative with how you manage it. This QA process allows me to do just that. If a new, potentially inexperienced copywriter with good writing skills and a low hourly rate does the bulk of the work, the more skilled writers, who are almost definitely more expensive, can be lined up to proofread, check tone, and generally make sure it is up to scratch, in half the time it took to write it. Just make sure they don’t end up re-writing the work. Empower them to provide constructive feedback directly to your copywriters, and effectively train them up.

If your QAs pick up on the same mistakes being made repeatedly, allow your copywriters the opportunity to review their edits. If they can actually see the corrections being made, they are more likely to bear them in mind when they write for you again. If fewer edits are required, then congratulations, you have made the process even more efficient and cost-effective.

Summary

Creating high-quality, unique copy and content on a large scale is never going to be easy, but it doesn’t have to be painful. With a bit of legwork at the beginning to establish a well built process, and by making the most of a network of freelancers, it has the potential to be a breeze.

Not only that, but you and your clients will undoubtedly reap the commercial rewards of your hard work. Using exactly this process, together with our global network of 1,700+ freelancers known as ‘Croudies’, we were able to produce city-specific landing page copy for a client with hundreds of locations. This work led to a 113% increase in organic traffic, coupled with a 124% uplift in domain visibility.

And the key to success? Engage your writers at every available opportunity, so they don’t feel like a cog in a machine. Provide them with valuable feedback and help them whenever you can. This will likely not only improve your enjoyment of the project, but you’ll also probably find that they are more willing to help with future work. And when the whole project goes off without a hitch and you receive fantastic reviews (because why wouldn’t you?), tell them the good news and allow them to share in your success.

How I Boosted My Rankings Without Creating Content or Building Links


I know what you are thinking: this isn’t possible.

Because the more content you have and the more links you have, the higher your rankings will be.

Although that is true, it doesn’t mean that content marketing and link building are the only ways to increase your rankings.

It doesn’t matter what update Google rolls out, I’ve found that there are a few hacks that consistently work to boost your rankings without creating more content or building more links.

So, are you ready to find out what they are?

What does Google want to rank at the top?

Before I get into the exact “hacks” and tactics that can boost your rankings, I want to first help you change the way you think about SEO.

Do you think Google really cares about on-page SEO and link building?

Sure, it matters to some extent, but that’s not what Google cares about the most.

Google wants to rank websites that people love. If they ranked websites that you hated, then you would slowly stop using Google.

And if people stopped using Google, then there would be fewer people to click on their ads, which means they would make less money.

That’s why Google cares about what you think and they ideally want to rank the websites that you love.

Now let’s dive into some hacks that will make people love your site, which will boost your rankings.

And don’t worry… I am not going to give you some fluffy tactics, I have data to back up everything. 😉

Hack #1: Optimize your click-through-rate

Let me ask you this:

If 10,000 people performed a Google search for the term “SEO” and clicked on the number 2 listing instead of the number 1 listing, what would that tell Google?

It would tell them that the number 2 listing is more relevant and that Google should move that listing to the number 1 spot.

Rand Fishkin ran an experiment where he told all of his Twitter followers to perform a Google search for the term “best grilled steak” and to click on the first listing, hit the back button, and then click on the 4th listing.

Within 70 minutes the 4th listing jumped into the top spot.

And that page even started to rank at the top of page 1 for the term “grilled steak”.

The ranking eventually slipped back down because people didn’t really feel that the listing was that great compared to some of the other listings.

Instead, it only climbed because Rand has a loyal following and everyone helped trick Google to believe that it was more relevant (at least in the short term).

But this should give you a sense that Google cares what you think. So much so that they will adjust rankings in real time because they don’t want to show you pages that you feel are irrelevant (no matter how many backlinks the page has or how well its on-page code is optimized).

And Rand wasn’t the only person who tested this theory. It’s been done countless times, and each time it produced similar results.

You want people to click on your listing more than the other ones out there. It’s that simple.

If you can generate more clicks (in a legitimate way) than the listings above you, eventually you’ll notice your rankings climb without having to write more content or build more links.

So, how do you get more clicks?

Well, you have to adjust your title tag and meta description tag to be more appealing.

Anytime you perform a Google search, you see a list of results. And each result has a title, URL, and description:

The link part is the title (also known as the title tag), then there is the URL (which is green in color), and lastly, there is the description (black text… that is also known as the meta description).

If you are running a WordPress blog, you can easily modify your title tag and meta description using the Yoast SEO plugin.

There are a few ways you can generate more clicks on your listing over the competition:

Include keywords – people tend to click on listings that include the keyword or phrase they just searched for. Make sure you are using the right keywords within your title and description (I will get to this in a bit). This may sound basic, but when your web pages rank for thousands of terms, which one do you include in your 60-character title tag?
Evoke curiosity – titles that are super appealing tend to generate clicks. For example, if the keyword you were going after is “green tea,” a good title would be “11 Proven Benefits of Green Tea (#6 Will Shock You)”. I know it may seem a bit long, but it works because a lot of people will wonder what number 6 will be.
Copy magazines – anytime you see a magazine, you’ll notice that they have appealing titles and headlines on the cover. A lot of their titles contain “how to” or are list oriented. Look at magazines for inspiration.

Improving your search listings isn’t rocket science. Where most people mess up is that they pick the wrong keywords or they are terrible at writing copy. Remember, humans are reading your title tag and meta description tag, so they need to be appealing.

If you are struggling to write appealing copy, read my ultimate guide to copywriting.

Now let’s go over the exact steps you need to take to get more clicks.

The first step is to use Google Search Console.

Log into Google Search Console, then click on “Search Traffic” and then click on “Search Analytics”:

You’ll see a page that looks something like this:

Scroll back up to the top and click on the “pages” radio button and “CTR” checkbox:

You’ll see a list of results sorted by your most popular URLs and their respective click-through-rate (also known as CTR):

Look for pages that have high traffic but a CTR of less than 5%.
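If you’d rather pull this list programmatically than click through the UI, the same data is exposed through the Search Console API. Here’s a minimal sketch; the property URL, date range, credentials file, and thresholds are placeholders to adjust:

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)  # placeholder credentials file
service = build("searchconsole", "v1", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://example.com/",  # placeholder property
    body={
        "startDate": "2018-07-01",
        "endDate": "2018-07-31",
        "dimensions": ["page"],
        "rowLimit": 1000,
    },
).execute()

# High-traffic pages with CTR under 5% are the rewrite candidates
for row in response.get("rows", []):
    if row["impressions"] > 1000 and row["ctr"] < 0.05:
        print(row["keys"][0], row["clicks"], f"{row['ctr']:.1%}")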

Click on one of the listings with a CTR of less than 5% and then click on the “queries” radio button:

You’ll then want to look for the keywords with the highest amount of “clicks” and the lowest CTR.

Those are the keywords you want to focus on in your title tag and meta description.

Remember, your title tag is limited to roughly 60 characters, which means you won’t be able to fit more than 2 or 3 keywords.

So, you want to pick the keywords that typically have the most clicks. They should also have a low CTR because you selected pages with a CTR lower than 5%.

By adjusting your title tag and meta description to include the right keywords and by evoking curiosity, you’ll be able to increase your clicks. This will get you more search traffic in the short run and boost your rankings over time.

Here are 3 tests that worked well for me when I adjusted my title tag:

I noticed I was getting a lot of traffic for the term “marketing digital” from countries outside of North America on one of my posts.

So, I adjusted my title tag from saying “digital marketing” to “marketing digital” which took my CTR from 3.36% to 4.45%. It also increased my search traffic by 1,289 visitors a month.

With the key phrase “social media marketing,” I adjusted my title tag based on an idea I got from a magazine. My CTR went from 2.38% to 2.84%. In total, that increased my traffic by 932 visitors a month.

With my social media marketing title tag, I added the phrase “step-by-step guide.”

This lets people know it is a how-to related post and it is action oriented. I also added the word “social media” a few times within the meta description.

And with the query “Google AdWords,” I noticed that Google announced that they are switching their ad platform name from Google AdWords to Google Ads, so I did the opposite and focused more on the term “Google AdWords” because very few people knew about the name switch.

This helped drive an extra 1,355 visitors per month.

I’ve also had instances where the changes I made hurt my Google traffic.

So, whenever you adjust your title tag and meta description, mark that date down and look at the data within Google Search Console after 30 or so days to see if it hurt or helped.

If it hurt, revert the change and wait another 30 days. Continuous testing can hurt your rankings, so when you have a losing variation, no matter what, wait 30 days, as this will let your rankings stabilize.

If the change helped boost your CTR and rankings, then you are off to a good start.

Now that you’ve optimized your click-through-rate, it’s time for you to optimize your user experience.

Hack #2: Show people what they want when they want it

If you go back to the experiment Rand Fishkin ran above, you’ll notice he told people to click the “back” button.

You don’t want people going to your site and clicking the back button… it will hurt your rankings.

People tend to click the back button because they don’t like what they see. If you can optimize your website for the optimal user experience, people will be less likely to click the back button.

I do this through 2 simple steps.

The first is to use Qualaroo and survey people. By asking people (right when they are on your website) a simple question of “how can I improve this page,” you’ll get tons of ideas.

You can even use Qualaroo to find out why people are visiting your website, which again will help you understand the type of people visiting your site. This will allow you to tailor your experience to them.

I ran a Qualaroo survey on my main blog page. The biggest feedback I got from you was that it was hard to find the exact content you were looking for.

And I know why, too. It’s because I have marketing-related content on everything. From ecommerce to SEO to content marketing…

I decided to try something out where when you land on the blog page, you can select the type of content that piques your interest and then all of the content gets tailored to your needs.

I also ran a Crazy Egg test to ensure that you like the change I made. Based on the Crazy Egg heatmap below, you can see that it was successful.

The bounce rate on my blog page dropped by 21% as well. 🙂

I then looked at the Crazy Egg scrollmap to see which elements/areas of the page have the most attention. This helped me determine where I should place the content filtering option.

The Crazy Egg scrollmap of my blog page shows that the content filtering option generates 70% of the page’s attention.

Placing the filtering in a place where there is a lot of attention ensures that I am giving you what you need in a place that is easy to find.

After you optimize your user experience, you want to focus on building a brand.

I recommend that you look at the pages on your site with high bounce rates and consider running this process in order to improve the user experience. When selecting the pages, make sure you are also picking pages that have decent traffic.

Hack #3: Build a brand

If you build a brand like Facebook, Amazon, or any other popular site, you’ll rank higher.

Eric Schmidt, the ex-CEO of Google, once said:

Brands are the solution, not the problem. Brands are how you sort out the cesspool.

I ran an experiment that helped build up my brand, and my search traffic skyrocketed (unintentionally).

My traffic went from 240,839 unique visitors per month in June 2016:

To 454,382 unique visitors per month by August 2016:

Once I realized the power of branding, I started a podcast called Marketing School, and I started to publish videos on YouTube, Facebook, and LinkedIn multiple times per week.

This has led me to generate 40,412 brand queries per month:

I’m even getting 3,806 brand queries per month on YouTube alone:

But as you know, producing good content doesn’t guarantee that your brand will grow.

Even if you build tools and release them for free (like I did with Ubersuggest), it still won’t guarantee success.

But the one thing I have learned that works is the rule of 7.

When someone hears your message 7 times or sees it 7 times, it is more likely to resonate; they build a connection and keep coming back.

So how do you get people to come back to your site?

The simplest solution that I’ve found to work is a free tool called Subscribers.

It leverages browser notifications to get people to “subscribe” to your website. It’s better than email because it is browser-based, which means people don’t have to give you their name or email address.

And then every time you want to get people to come back to your website, you simply send them a notification.

Look at how I’ve gotten over 42,316 people back to my site 174,281 times. That’s roughly 4 times per person.

Based on the rule of 7, I only have 3 more times to go. 😉

The way I use Subscribers is that I send out a notification blast every time I release a blog post.

The push looks something like this:

And instantly I’m able to get people back to my site:

When you start using Subscribers you won’t see results right away. It takes time to build up your subscriber base, but it happens pretty fast.

Typically, you’ll generate a browser notification subscriber three times faster than an email subscriber.

Conclusion

If you only focus on things like on-page SEO, link building, or even blogging, you won’t dominate Google.

Why?

Because that is what everyone else focuses on. You have to do more if you want to beat the competition.

By doing what’s best for the user, you’ll have a better chance of beating everyone else.

Just look at me: I do what every other SEO does, plus more. Sometimes this causes my traffic to dip in the short run, but in the long run, it generally climbs.

From creating compelling copy so people want to click on your listing, to optimizing your user experience, to building a brand… you have to go beyond the SEO basics.

SEO has become extremely competitive. 5 years ago, it was much easier to rank at the top of Google.

If you use the 3 hacks above, here’s how long it will typically take to notice results.

Optimizing title tags – assuming you run successful tests, you can see small results in 30 to 60 days. Over time the results get even better.
Improving user experience – making your user experience better will instantly improve your metrics such as bounce rate, pageviews per visitor, time on site, and conversion rate. As for search rankings, it does help, but not instantly. Typically, it takes about 4 to 6 months to see results from this.
Brand building – sadly, it takes years. Sure, tools like Subscribers will instantly grow your traffic, but they won’t impact your search rankings right away. You have no choice but to build a brand.

So which one of these hacks are you going to test out first?

The post How I Boosted My Rankings Without Creating Content or Building Links appeared first on Neil Patel.

Yoast SEO 8.1: Gutenberg part 2, introducing the snippet preview

Posted by on Aug 28, 2018 in SEO Articles | Comments Off on Yoast SEO 8.1: Gutenberg part 2, introducing the snippet preview

Yoast SEO 8.1: Gutenberg part 2, introducing the snippet preview

Two weeks ago, we launched Yoast SEO 8.0. In it, we shipped the first part of our integration with Gutenberg: the sidebar. That release was the foundation on which we are building the next parts of our integration with the new WordPress editor. In Yoast SEO 8.1, we introduce part 2: a Gutenberg-proof snippet preview. Also, a much better experience in the content analysis thanks to webworkers!


Gutenberg, meet the Yoast SEO snippet preview

Yoast SEO 8.0, unfortunately, had to make do without a snippet preview inside Gutenberg. There were still some kinks to iron out before we could add that snippet preview to our WordPress plugin. The code for that new modal — the pop-up screen — had to be written from the ground up, exclusively for Gutenberg. That code has now been added to Gutenberg’s core so every WordPress developer can make use of the modal inside the new editor. How awesome is that!

Here’s what the snippet preview pop-up inside Gutenberg looks like:

You’ll see that it looks just like the regular Yoast SEO snippet preview. It has all the features you know and love: true-to-life rendering of your snippet on both mobile and desktop screens, an SEO title field with snippet variables, a slug editor, and a meta description field, also with snippet variables. To open the snippet preview, simply click the Snippet Preview button in the Yoast SEO Gutenberg sidebar.

Another cool thing now available in Gutenberg is the Primary Category picker, a staple of Yoast SEO for many years. It lets you pick and set the primary category for a post, which will be selected automatically whenever you create a new post. We will port more features over to Gutenberg shortly.

What’s next

We, of course, have big plans for Gutenberg. There’s still a lot to be done, and not everything we’re dreaming up is possible right now. Step by step, we’re turning Yoast SEO and Gutenberg into a dream combination. We’re not just porting existing features over to Gutenberg, but actively exploring what we can do and what we need to make that happen. In some cases that means developing the support inside Gutenberg’s core ourselves, so that loads of other developers can benefit from the results as well.

Speeding up the content analysis with webworkers

Speed = user experience. To keep Yoast SEO performing great, we added a dedicated webworker to our content analysis. Webworkers let you run a script in the background without affecting the performance of the page. Because it runs independently of the user interface, a webworker can focus on one task and do it brilliantly. Webworkers are very powerful and help us keep Yoast SEO stable, responsive, and fast, even when analyzing pages with thousands of words of content. Try it!
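Yoast’s actual analysis code is far more involved, but the basic webworker pattern is simple. A sketch (the file names and the word-count “analysis” are placeholders, not Yoast’s real code):

```typescript
// ---- analysis.worker.ts: runs in the background, off the main thread ----
const ctx = self as unknown as Worker; // worker global scope (TS "webworker" lib)
ctx.onmessage = (event: MessageEvent<string>) => {
  // A toy stand-in for the real content analysis: count the words.
  const words = event.data.split(/\s+/).filter(Boolean).length;
  ctx.postMessage({ words }); // reply to the main thread
};

// ---- main thread: the editor stays responsive while the worker works ----
const worker = new Worker("analysis.worker.js");
worker.onmessage = (event: MessageEvent<{ words: number }>) => {
  console.log(`Analysis done: ${event.data.words} words`); // update the UI here
};

const draft = "Thousands of words of post content…";
worker.postMessage(draft); // hand the content off without blocking the page
```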

The update is available now

Yoast SEO 8.1 has a lot of improvements behind the scenes that should drastically improve how the plugin functions. We are dedicated to giving you the best possible user experience, while also improving our current features and laying the groundwork for new ones. And not to forget that new WordPress editor, right? Update and let us know what you think!

Read more: Why you should buy Yoast SEO Premium »

The post Yoast SEO 8.1: Gutenberg part 2, introducing the snippet preview appeared first on Yoast.

The evolution of search: succeeding in today’s digital ecosystem – part 2

Posted by on Aug 28, 2018 in SEO Articles | Comments Off on The evolution of search: succeeding in today’s digital ecosystem – part 2

In the first part of our discussion on the evolution of search, we looked at the change in customer behaviors, which has led to a struggle between search engines and apps to remain relevant.

We also started to dissect key parts of the new digital ecosystem, looking in detail at the most obvious manifestation of these indirect answers, the information that powers these, and the change in mindset required to capitalize on the opportunities direct answers present. In this second part, we will consider further the outputs of the fundamental changes to search—and what this means for SEO as a channel in the future.

Voice is important, but we’re looking at it the wrong way

It wouldn’t be right to consider the evolution of search and featured snippets without discussing voice search. Many are looking to this as the new frontier for search, doubling down on strategies to become the answer to questions that people ask. Voice search is undoubtedly taking off in a big way, with 2016 being a turning point in the growth of the channel, but there are two challenges “voice marketers” will face. Firstly, there is still a stigma attached to using voice in public: consumers may use quick commands, but they have yet to embrace the full capabilities of smart assistants in front of other people.

Secondly, smart speakers are becoming a part of people’s homes in a big way, with an estimated 40% of UK homes due to have an Amazon Echo in 2018. Despite this, companies will struggle to get their audiences to accept unsolicited branded messages. This has become even more of a problem in the wake of GDPR and claims of smart devices “listening in,” though I expect more tolerance to come in the future.

Until that point, it doesn’t matter if you’re the answer; users won’t know who has delivered the results they are listening to.

A much bigger opportunity in voice, although falling a little outside of the search marketer’s remit, is “skills.” When the app store launched, many of the first apps were utilitarian or games; the idea of a “branded” app was yet to be developed. However, as smartphones became ubiquitous, the prevalence of apps increased. I believe the same will be true of skills. For now, many of these provide data that the assistants cannot store first-hand, such as bus times and weather information. Over time, however, they could provide a branded experience for more conventional voice queries. Already, skills allow brands to provide a personalized response across voice. Importantly, as skills must be linked, these are solicited; or, put simply, you can brand the answers you give to user questions in an agreed format. Right now, this is a powerful tool; in the future, it will be a game-changer.

For those still looking to own the answers, owning the data feeds is key. While you can optimize for this in the same way as featured snippets, it’s harder to convince voice speakers that you are the one result to rule them all, because their sole result has to be infallible or users will stop asking. This is why I believe Yext’s recent announcement that they will be pushing information directly to Alexa is as critical a change to search marketing as the launch of Penguin or Panda. For the first time, key data and knowledge feeds can be fed directly into these platforms, and brands can not only influence the information that Google, Amazon, Microsoft, and other platforms hold on them (which is currently the case with answer optimization); they can own the narrative entirely.

As search engines look to promote results directly in search (whatever the format), this is a giant step forward towards the digital ecosystem of the future and should not be underestimated.

Speed and mobile are intrinsically linked; new formats will enable this

We’re all bored of hearing the phrase “content is king”; in fact, the “is king” moniker has been done to death. Perhaps that’s why “speed is king” doesn’t carry the weight it should, and that’s a shame, because it risks overlooking a crucial part of web marketing in 2018. From a pure SEO perspective, speed is now linked to improved visibility, in the same way that the interstitial ad penalty penalized sites for pop-ups.

However, if you’re blocking pop-ups or reducing your page load times for search traffic alone, you are firmly missing the point. This isn’t an “SEO thing.” This is a user experience essential, based on the changing demands of the digital-savvy customer in the modern age of technology; an audience that expects to quickly access the content they wish to furiously consume. Any delays or blockers in this process can be disastrous—not only to the brand, but to search engines as a whole.

Popular apps provide seamless, tailored experiences to their users; to stay as information leaders, this has to be replicated across search. A slow response, even if it’s not directly the fault of the provider, only serves to drive users away.

This is why Google is backing new formats; from accelerated mobile pages to progressive web apps and all device-focused changes (including in their index), the search giant is looking to improve the quality of the mobile web, a challenge it is uniquely well-positioned to undertake. As SEOs we should be embracing this—it’s better for our users. Yet we are limited by questions around tracking and data integrity (which Google is looking to change) and by the main engines’ ability to crawl and index JavaScript content, a programming language that will be key to bringing about the change that Google, Bing, and other providers need to stay relevant to their users.

For now, the biggest threat is mobile and apps; as other emerging technologies become more widely adopted, particularly in the immersive experience space, both the web and search engines will need to catch up to survive. And I believe that not only is it the responsibility of SEOs to drive forward these changes, it is both absolutely in our interest to do so and intrinsic to the continuation of investment in our channel. 

The future is bright, but SEO will never be the same

With the rise of apps and Google looking to push answers directly to users, reducing the importance of the website in the digital ecosystem, you could argue that the importance of SEO activity is dwindling. This would be a myopic view of the future; while the basis of our activity roadmap may change, there will still be a requirement for optimization. Just as the major algorithm launches earlier in the decade fundamentally changed the way we operate and the skills required to succeed in the channel, so too will the behavioral changes we are currently experiencing. As we have always done, we will adapt.

In his 2016 Brighton SEO talk, Jono Anderson argued that the digital marketer of the future will not need to learn new skill-sets but combine existing ones. For search marketers, this means focusing on specific areas of knowledge where we can be the most effective, instead of trying to know it all as we currently do. Most digital agencies have already separated content and SEO teams into two different, yet complementary, work streams. Structuring technical and local experts into teams of their own is becoming more popular, and doing so allows the marketers within them to shape their abilities around the requirements and objectives of their specialism.

Looking ahead, there will always be a place for search engines in the digital ecosystem, although their importance to the whole is yet to be decided. As such, there will be a continued opportunity (and need) for search marketing. The SEO of the future may be a very different person from the SEO of today, and the focus of digital agencies will be split between building brands, building web experiences, and structuring information to be easily understood by data feeds. But until agencies truly leave the ranking factors of the past behind and fully support this new digital world, powered by technology, convenience, and customers, our channel will be at perpetual risk of becoming irrelevant to our audiences.

Writing great social media content for your blog

Posted by on Aug 28, 2018 in SEO Articles | Comments Off on Writing great social media content for your blog

Writing great social media content for your blog

I’ve always felt lucky blogging for Yoast.com. As I wrote before, I have an entire blog team that makes sure my post gets scheduled, is free of grammar and spelling errors, and gets published on social media. So I ‘only’ had to come up with an idea, which the team often helped me with, and type the post. I decided that if I ever were to outsource things on my own blog, it would be things like promotion and social media.

My struggle with social media

And then the inevitable happened. After I finished my previous post, I got a message: “Caroline, from now on, please write your own introduction for Facebook, Twitter, and the newsletter. Here’s some information for you. If you have any questions, let us know!” Hold on! Yes, I have questions! Starting with: “How do I do this?” and: “Do you have any idea how difficult it is to write short messages? There’s a reason I’m not active on Twitter!” And so began my struggle, and search, for the ultimate social media messages.


Because truthfully, I’d rather type a 2,000-word essay than one sentence for Facebook. When you’re reading this, I’ve already grabbed your attention. You’ve already made it down to this point in my post, which means that you want to read my message. On social media, I can’t spend over a hundred words making my point. If I do, you might not click, you might scroll past my message, and you’ll never see my post at all.

And that’s how I started my two-day research. Two days? Yes. I, of course, started rather late with this blog post and had almost no time to conduct proper research. So, all the information in this post is based on my common sense – and I’ll teach you how to use your common sense too! Oh, how amazing my job is. Truly. Well, apart from having to write my own social media messages now.

To click or not to click

When do you click on a Facebook message? When do you hit the like button? When do you leave a reply? And when do you take the effort to go to someone’s profile and visit their domain through Instagram if there’s a ‘link in bio’ message underneath a photo? Those questions were the most important for me the last few days, to figure out what the perfect message entails. To find the answer to these questions, you need to know who your audience is.

For my blog, that’s a rather easy answer: the target audience for my blog is me! And people like me, of course. But, I started my blog because I love writing. I’m right in the middle of my audience: young mothers (and fathers, of course) who are struggling with parenthood and want reassurance that others are struggling too. I want people to laugh at my stories, but also to take their struggles and life a little less seriously, in order to enjoy life more.

Experimenting on different platforms

While people who visit my blog always tell me I have a great sense of humor – except for my husband, who still claims I have no humor at all – my Facebook page didn’t reflect my blog at all. Come to think of it, I didn’t even like Facebook.

I started experimenting on Instagram: my photos were more blunt, I used a lot of hashtags (thirty hashtags seems to be the maximum) and I treated Instagram as if I was talking to my best friend. Immediately, my engagement went up. People responded to my photos with more than just a heart, they actually left messages! I started to get to know my audience more and more, and then a few days ago I decided I’d use the same strategy on Facebook.

I took a notebook and wrote down when I was interested in a Facebook post from another company, and when I scrolled past. And, although this is personal (and not perfect) research, this works for me, since I am a reflection of my own audience. I made notes on the posts I clicked on: what was the message they wrote? What was the title of the post? Did the image appeal to me? And when did I decide not to click on a post?

I found out that I click the link when all three aspects of a post – the text, the title, and the photo – appeal to me. There were messages I saw multiple times but didn’t click, because the Facebook image wasn’t appealing enough, or the leading text was too vague or didn’t catch my attention.


How to find your voice on social media

It’s important your social media reflects your website. If you write for solo travelers who are 20 years old, it’d be strange if your social media posts are more appealing to people who’d rather stay in and haven’t taken a vacation in the last 20 years. Just like you once found your voice for your blog, you need to find your voice on social media too. And you’ll have to experiment before you find it. Here’s how to experiment:

Realize that your social media are part of your brand

Facebook, Instagram, and other social media are extensions of your blog. Try to find the reason why you follow someone on Instagram, hit the like button on Facebook, or retweet a message on Twitter. It’s probably because you feel connected to that person or brand. In this case, those social media accounts should reflect the blog.

Write different introductions

By writing and rewriting your Facebook messages a few times, you will eventually find the voice that fits your brand. You can’t be as elaborate on Facebook or Instagram as you are on your blog. You need to catch people’s attention and get them to click that link to your website.

With Facebook, you can easily re-post a post that’s a couple of months old. Check which posts performed worse – you can look that up on your Facebook page under ‘Statistics’. Check the accompanying messages you wrote, try to rewrite them, and see if you can gain more clicks.

It’s all about strategy

Just as you need a plan for your blog, you also need a social media plan and strategy. If you post on Facebook only once a week, you probably won’t reach a lot of people. However, if you post once or twice a day, you’ll see your reach go up. Those posts don’t always have to be a link to your blog, especially not when you only blog every other day or once a week. Share images, ask questions, share links to other blogs in your niche, or share quotes. Look at your competition and try to find a new angle to implement on your social media profiles.

Read more: How to use social media »

And now it’s time for me to write a nice introduction for social media so you’ll actually end up clicking and reading this message. Wish me luck. Oh and please drop your tips on me as well! You have no idea how much I learn from the comments you leave on my blog posts!

Keep reading: Social media strategy: where to begin? »


The post Writing great social media content for your blog appeared first on Yoast.

Search basics: the difference between URL structure and Information Architecture

Posted by on Aug 27, 2018 in SEO Articles | Comments Off on Search basics: the difference between URL structure and Information Architecture

Search basics: the difference between URL structure and Information Architecture

I’ve recently noticed some confusion around the industry on the differences between URL structures and Information Architecture (IA). I thought it was worth clarifying a few points and giving you all some language that is useful when talking about the differences.

Pre-requisites – if you aren’t familiar with the following elements, it’s worth reading these primers before you dive in deeper here:

What is a URL?
What are the SEO considerations for URLs?

Google’s guidelines for URLs

What is Information Architecture?

That article focuses specifically on IA considerations for SEO; for a broader overview of IA more generally, this is a great resource.

The specific thing I want to clarify is the differences between decisions about the path in your URLs and decisions about your IA, as this is where I often see a ton of confusion.

Decisions about URL structures and decisions about the IA of your website both involve questions about grouping and hierarchies of pages. For example:

URL: should the path of an individual product be:

/product-slug
/products/slug
/products/category/slug
/products/category/sub-category/slug

IA: how should we group our product pages and link between them:

Should there be a link “up” to the parent category?
How many “levels” of sub-category page types should there be?
How do we link between sibling products in the same (sub-)category?
How many products can we reach in (e.g.) 3 clicks from the homepage?
How should we handle facets?

The fact that both concern groupings and hierarchies has led too often to people misinterpreting IA questions as URL questions.

From an SEO perspective, most of the grouping and hierarchy questions we care about are questions about which pages should exist (e.g. should there be an indexable page for “red men’s shoes above a size 11”) and how should our pages be linked together (both from a crawling perspective – thinking about considerations like click depth, and from a ranking perspective – thinking about considerations like internal link equity).

Unfortunately, I’ve too often seen these IA questions expressed as URL considerations, and this can lead to advice that is less effective than it should be. For many of these IA questions, you can come down on either side of the IA decision with either URL structure:

You can choose to have a (sub-)category page type without necessarily having the (sub-)category appear in the URL as a keyword or as a folder (and indeed, there are times when this is a good idea, such as when products can be in multiple categories or often move category)
You can choose to link “up” the hierarchy or “across” to sibling products with or without those link targets sharing elements of their paths (e.g. a product page at /product-slug can link to a parent page at /category even if it doesn’t have a URL of /category/product-slug – see the sketch below)
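To make that last point concrete, here’s a sketch (assuming an Express app and a hypothetical two-item catalog) where product URLs are flat but every product page still links “up” to its category, so the hierarchy lives in the IA rather than in the path:

```typescript
import express from "express";

const app = express();

// Hypothetical flat catalog: the category is data, not part of the product path.
const products: Record<string, { name: string; category: string }> = {
  "red-wingtip-shoe": { name: "Red Wingtip Shoe", category: "mens-shoes" },
  "blue-suede-shoe": { name: "Blue Suede Shoe", category: "mens-shoes" },
};

// Category pages exist in the IA...
app.get("/category/:categorySlug", (req, res) => {
  res.send(`<h1>${req.params.categorySlug}</h1>`);
});

// ...but product URLs stay flat at /product-slug.
app.get("/:productSlug", (req, res) => {
  const product = products[req.params.productSlug];
  if (!product) {
    res.sendStatus(404);
    return;
  }
  // The link "up" encodes the hierarchy, even though the path doesn't.
  res.send(
    `<h1>${product.name}</h1>
     <a href="/category/${product.category}">Back to ${product.category}</a>`
  );
});

app.listen(3000);
```

The reverse also holds: you could nest the paths without changing a single link, which is exactly why the two decisions are independent.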

In general, the IA considerations are more important than the URL considerations, and you should focus on the Information Architecture with higher priority. It’s IA that governs the flow of internal link equity (PageRank) and also that governs crawlability and discoverability of different pages and page types. In general when we talk about pages being “higher in the IA” or “closer to the homepage” we mean in click-depth rather than folder structure. You can’t fix IA issues with URL changes alone. For whatever IA decisions you make, you can then make decisions around how to structure the paths for your pages’ URLs to make the best trade-off you can between the constant tensions:

It can be good to have appropriate keywords in the path (for users and search engines(*))
Human-readable paths are helpful (and structure can help with reporting)
BUT shorter paths are generally better than longer

(*) see “the importance of keywords in URLs” below

Probably the only real constraint that paths create for IA is that if you go down the path of having nested folders, that will generally imply the existence of pages at each level of folder. In other words, a page at /level1/level2/level3 implies that /level1/level2 will also exist as a real page (as much for users as anything else).

Summary of IA vs. URLs

Information Architecture decisions for search performance focus on:

What (kinds of) pages should exist on my site?
How should our pages and page types link to one another?

You may choose to group pages of the same page type together by, for example, placing them in a folder, but this is an independent decision about URL structure. In general, URL structure decisions are less important than IA decisions.

How important are keywords in URLs?

At MnSummit, my colleague Rob heard Google representative John Mueller say that there was no SEO need to translate URLs for foreign language sites. This surprised me, because (unless Google is already translating all inputs and outputs) this implies that keywords in the path make no difference in search either. I would have thought that all else being equal a page called /shoes/red would outperform /products/12512 for a whole variety of reasons.

So: I’m inclined to add this to the list of things that Google says are true that may be technically true, if read narrowly enough, but are unhelpful in the real world. My most charitable reading is that John is saying something like “Google does not have a specific element of the algorithm that checks language in page paths”.

So, although the primary focus of this post has been IA considerations I do think that it’s sensible to have some element of descriptive keyword in your URLs because although we can’t be certain it’s an explicit ranking factor:

Above everything else, it’s good for usability
Google says this explicitly in their guidelines

It is a keyword-relevance signal of some kind (however weak)
Google’s guidelines’ only argument is the usability one above, but that whole guideline section is explicitly about performance in Google, with primarily technical advice, and it seems reasonable to me to believe that they are saying “we prefer (and rank better) pages like this because users prefer them”
We do know explicitly that URL, path, and filename are explicit keyword signals for some file types

It is more likely to result in relevant anchor text in external links
A point made even in Google’s own SEO guide (in a section entitled “Simple URLs convey content information” which supports the arguments above as well)

I would expect a better click-through rate from the search results when it does rank

And for sure try to have URLs in the correct target language. Regardless of whether you agree with me or John Mueller about the SEO benefit, I think we both agree that your users would prefer URLs in their own language.

A note on changing URLs and moving content

There are always risks to moving content – even with well-implemented redirects and no mistakes – and so you should only undertake URL changes with care. In general, IA changes are more reversible: things are more likely to go back to how they were if you undo the change, while the nature of a 301 (“permanent”) redirect is that it signals things are not going to change back.

For that reason, while we would often recommend moving from dynamic URLs with a bunch of parameters in them to cleaner URLs, and may recommend moving from impenetrable URLs to more readable ones, it will generally be hard to justify a move from reasonably good URLs to arguably better ones. Do your own risk assessment, and proceed with caution!
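If your risk assessment does favor a move, the mechanics are straightforward: a permanent (301) redirect from each old URL to its new home. A minimal Express sketch with hypothetical paths:

```typescript
import express from "express";

const app = express();

// Hypothetical mapping from old parameter-based URLs to cleaner ones.
const redirects: Record<string, string> = {
  "12512": "/shoes/red",
};

// e.g. /products.php?id=12512 -> 301 -> /shoes/red
app.get("/products.php", (req, res) => {
  const target = redirects[String(req.query.id)];
  if (target) {
    res.redirect(301, target); // "permanent": hard to walk back, so be sure
  } else {
    res.sendStatus(404);
  }
});

app.listen(3000);
```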

How to create an optimized career page for your website

Posted by on Aug 25, 2018 in SEO Articles | Comments Off on How to create an optimized career page for your website

How to create an optimized career page for your website

With recruitment as competitive a market as it has ever been, it’s essential to ensure every careers page or job vacancy on your website is fully optimized in order to place it in front of the perfect candidate online.

Online job boards are some of the largest and most powerful websites around, but they typically lack page-level authority, so while you cannot compete with them at the domain level, you can still outrank these huge companies with good SEO.

The next step is selling your vacancy to the candidate, which can sometimes be a tough process, but one that your job pages can definitely help you out with.

How should you go about doing this?

Conduct thorough keyword research

Your first port of call to ensure your careers pages are fully optimized is to conduct some thorough keyword research in order to identify relevant keywords to target on your job pages.

Location-specific job searches invariably have a favorable ratio between search volume and keyword difficulty (competitiveness), so it’s crucial to ensure you are targeting properly before you begin to optimize.

Ahrefs is my tool of choice for this due to its ease of use and the array of filters available.

Use internal links

Internal links are your chance to tell Google which pages on your site are the most important. You can manage your internal links as you wish, but one recommended strategy is to point internal links at any page you are trying to rank from the more powerful pages on your website.

A good way of finding these authoritative pages is by using the ‘Top Pages’ report in Ahrefs (other tools are available), which will sort your pages by URL rating (authority) in descending order. You are left with a list of your most powerful pages, ready to be linked from.

When trying to boost vacancy pages, adding natural-looking anchor text along the lines of ‘Like what you are reading? See our latest job openings’ and linking to the live positions can work well.

Internal linking is an oft-underutilized strategy in SEO, and Andy Drinkwater is one of the more prominent voices on the topic, often sharing useful, actionable information with the SEO community.

Maximize your content

Ensure the copy featured on each of your careers pages is optimized to rank well. Your content should be specific to your company and the individual role, with a minimum of 250 words.

Make it enticing! And if your company has a personality, ensure you show it.

The copy itself should be relevant and informative to the user, answering any specific queries they may have. The more information you can give the prospect the better.

Avoid duplicate content at all costs and try to be creative – you can assume the job seeker is looking at a number of job posts so you really need to try and stand out here.

Go behind the scenes

Provide potential employees with a look behind the scenes at your offices before they apply for a role. This is likely to benefit both you and the prospective employee as they can see if the environment appeals to them.

An office walkthrough is the ideal way to show what life is like at your company, plus the tour footage can form part of your Google Business listing (if recorded by an accredited Google Business photographer). Appearing alongside your company address and telephone number, it’s an effective way to boost your site’s local SEO.

If you really want to stand out from your competitors, however, why not invest in a 360 degree tour of your office? This can also be VR-based. Interactive and realistic, it’s the next best thing to being in your office in person and will help a prospective employee to really visualize working for you.

Once you’ve taken these factors into consideration, you also need to think about Google Jobs.

What is Google Jobs?

Having launched in the US in 2017 and the UK in July this year, Google’s new job search tool Google Jobs looks set to radically alter the way job seekers search for roles, also impacting recruitment agencies and their processes.

Google caused a disturbance in the flight industry with the launch of Google Flights, which saw an immediate uptick in bookings from customers who were frustrated by the tendency of airlines to withhold information about additional costs such as baggage fees whilst booking, in order to make their flights appear cheaper.

Inc.com attributed the success of Google Flights to increased transparency to customers, who are able to see all the relevant costs prior to booking a flight, plus any predicted delays. The impact of the launch of Google Flights was immediate, with Business Insider stating the platform was “…an embarrassment to the airline industry”.

The search engine’s success in identifying and capitalizing upon weaknesses in the travel and tourism industry is expected to be replicated in the recruitment industry with the launch of Google Jobs.

Simply recognizing users’ frustration at a lack of information, collating the results in one place, and then providing that information immediately results in a more valuable service for users.

What does the launch of Google Jobs mean for job vacancies online?

Google Jobs has been designed to simplify and speed up the process of job-hunting for the job-seeker. At the US launch of Google Jobs last year, Google CEO Sundar Pichai announced that the purpose of the tool was to “better connect employers and job seekers”.

Users are able to filter roles by key criteria such as necessary qualifications and experience, working hours, salary and commute. Recruiters and employment platforms currently working with Google Jobs include LinkedIn, Monster, Glassdoor and Payscale (but interestingly, not Indeed).

The impact on recruitment companies will be severe. Even if you rank #1, you will now have the Google Jobs ‘import’ sitting above you, plus the usual PPC ads.

While it’s safe to assume that Google will weight Google Jobs above all other recruitment platforms, it is worth bearing in mind that the company recently received a $5 billion fine from the EU for abusing its Android dominance, so it may – initially at least – proceed with more caution than usual.

What does this mean for my job vacancy?

Google Jobs pulls through vacancies from many recruitment company sites and job boards. Unfortunately, at the time of writing, there is no way to get your (an SME’s) specific role featured in the platform without posting it on one of these job sites/boards.
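For reference, the listings Google Jobs pulls through are described with schema.org JobPosting structured data. Here’s a sketch of what that markup looks like (all values hypothetical), written as a TypeScript object that gets serialized into a JSON-LD script tag:

```typescript
// Hypothetical vacancy, marked up per schema.org/JobPosting.
const jobPosting = {
  "@context": "https://schema.org",
  "@type": "JobPosting",
  title: "Front-End Developer",
  description: "<p>Join our team building fast, accessible websites.</p>",
  datePosted: "2018-08-25",
  validThrough: "2018-09-25T23:59",
  employmentType: "FULL_TIME",
  hiringOrganization: {
    "@type": "Organization",
    name: "Example Recruitment Ltd",
    sameAs: "https://www.example.com",
  },
  jobLocation: {
    "@type": "Place",
    address: {
      "@type": "PostalAddress",
      addressLocality: "London",
      addressCountry: "GB",
    },
  },
};

// Embedded in the vacancy page as:
//   <script type="application/ld+json">…</script>
const jsonLd = JSON.stringify(jobPosting, null, 2);
console.log(jsonLd);
```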

For businesses who have steered clear of these in the past, now may be the time to start signing up.

We can assume Google Jobs’ popularity is only going to increase, so if you want to maximize the chance of your vacancy being seen, don’t get left behind.


Ask Yoast: Can Google deal with Lazy Load?

Posted by on Aug 24, 2018 in SEO Articles | Comments Off on Ask Yoast: Can Google deal with Lazy Load?

Ask Yoast: Can Google deal with Lazy Load?

We’ve said it time and again: site speed is a crucial aspect of your SEO. That’s why we often write about site speed tools, speed optimization, and other things you need to know to make your site lightning fast. One factor in site speed is image optimization: on most sites, images will play a part in loading times. So, giving your image SEO some thought will pay off.


Besides resizing and compressing your images to improve loading times, there’s the option to implement ‘lazy loading’ on your site. Lazy loading means that an image or object on your site doesn’t load until it scrolls into your visitor’s view. For example: if a page has 8 images, only those that appear ‘above the fold’ load right away, while the others load as the user scrolls down. This can significantly improve speed, especially on pages that contain a lot of images. There are several plugins you can use to add lazy loading to your WordPress site. But is there really no catch? Will Google still index all your images?

MaAnna emailed us, wondering exactly that:

I’m testing the lazy load image function in WP Rocket. In online testers like WebPage Test, the waterfall doesn’t show the images loading, but when I do a Fetch and Render in Google Search Console all images on a page are shown. Can Google deal with lazy load and still index our images, as Fetch and Render seems to indicate?

Watch the video or read the transcript further down the page for my answer!

Can Google deal with Lazy Load?

“Yes, it can. It renders the page, it waits a bit and it scrolls down the page a bit to generate all the events that it needs to generate to make sure that it has loaded the entire page.

So yes, it can deal with that. You’re very fine using something like that lazy load image function. Google actually has code itself as well, in which it promotes the lazy loading of images because it really enhances people’s experience because pages get faster using lazy load. So, by all means, do use it. Use it well. Good luck!”
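If you’re curious what a lazy loader actually does, here’s a minimal sketch using the IntersectionObserver API (plugins like WP Rocket are more robust than this; the data-src markup pattern is a common convention, not a requirement):

```typescript
// Markup convention: images start with data-src instead of src, e.g.
//   <img data-src="/images/photo-1.jpg" alt="A photo">
const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue; // not in (or near) the viewport yet
    const img = entry.target as HTMLImageElement;
    if (img.dataset.src) {
      img.src = img.dataset.src; // swap in the real source
    }
    obs.unobserve(img); // each image only needs to load once
  }
});

// Watch every image that hasn't loaded its real source yet.
document
  .querySelectorAll<HTMLImageElement>("img[data-src]")
  .forEach((img) => observer.observe(img));
```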

Ask Yoast

In the series Ask Yoast, we answer SEO questions from our readers. Do you have an SEO-related question? A pressing SEO dilemma to which you can’t find the answer? Send an email to [email protected], and your question may be featured in one of our weekly Ask Yoast vlogs.

Note: you may want to check our blog and knowledge base first, the answer to your question could already be out there! For urgent questions, for example about the Yoast SEO plugin not working properly, please contact us through our support page.

Read more: Does site speed influence SEO? »

The post Ask Yoast: Can Google deal with Lazy Load? appeared first on Yoast.