
Structured Data Can = MehSEO

Posted on Aug 30, 2018 in SEO Articles


In 2011, Google, Bing & Yahoo announced Schema.org, which got SEOs all excited to start marking up website content to turn it into “structured data.” The benefit would be that search engines would be more certain that a text string of numbers was in fact a phone number, or at least more certain that you wanted them to think it was a phone number. The search engines could then turn the structured data into eye-catching fripperies designed to seduce searchers into surrendering their clicks and revenue to your fabulously marked-up site (aka “Rich Results”).
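To make the phone-number example concrete, here is a minimal JSON-LD sketch of the kind of markup involved (the business name, number, and URL are invented for illustration, not taken from any real site):

{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Plumbing Co.",
  "telephone": "+1-919-555-0100",
  "url": "https://www.example.com/"
}

With that in place, the string of digits is no longer just text; it is explicitly declared to be the business's telephone number.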

It also could help your fridge talk to your Tesla.

So pretty much every SEO marked up their audits and conference presentations with recommendations to mark up all the things. LSG was no exception. And we have seen it work some nice SEO miracles.

There was the ecommerce site that lost all its product review stars until we reconfigured the markup. There was the yellow pages site that got a spammy structured data manual action for merging a partner’s review feed into its own. There is the software vendor and its clients that (still!) violate Google’s structured data guidelines and get away with it. There have been countless Knowledge Panels that have needed the tweaking one can only get from a perfectly implemented https://schema.org/logo.

But structured data is not a killer SEO strategy for all situations, and it’s important that SEOs and clients understand that often it’s more of a future-proofing game than an actual near-term traffic or money-generator. For example, let’s take this UGC site that generated about 22 million clicks from Google over the past three months and see how many clicks are reported as coming from “Rich Results” in Google Search Console:

So less than one-half of one-half of 1% of clicks came from a “Rich Result.” Not particularly impressive.

The good news is that Google is in fact using the structured markup. We can see evidence of it in the SERPs. But it’s likely the content of this site doesn’t lend itself to eye-popping featured snippets. For example, many of the Rich Results appear to just be bolded words that appear in the URL snippets in the SERPs, kind of like this:

It also may just take time before Google trusts your markup.

So before you drop everything and prioritize structured markup, you may want to consult Google’s Structured Data Gallery to get an idea of which types of content Google is pushing to mark up. You should also check the SERPs to see what your competitors are doing in this area and how their marked-up content is being displayed. This should give you a good idea of the potential for your site.

And remember, “you can mark up anything, but you can’t mark up everything…” – Tony Robbins?

The post Structured Data Can = MehSEO appeared first on Local SEO Guide.

Getting personal with SEO: how to use search behavior to transform your campaign

Posted on Aug 29, 2018 in SEO Articles

In order to meet the needs of today’s consumers and a more intelligent digital market, creating value in optimization campaigns requires innovative thinking and a personalized approach. Adverts, landing pages, and on-site messages that feel tailor-made are becoming the norm for many brands, contributing to higher response rates, visibility, and value.

Arguably, in today’s post-truth era, creating a personal message that can tap into the emotions and needs of a consumer is exactly the direction in which we will continue to progress. It’s also likely that in the near future, this will become the only way that optimization campaigns can be successful.

Anyone can enhance and deliver stronger campaigns by gleaning insights from search behaviors and using them to directly address digital customers. But how can you maximize the effectiveness of doing this? Using Delete’s European Search Award-winning campaign for Leeds Beckett University as a case study, this article will take an in-depth look at profiling and understanding your browsers to attract and convert new customers.

Why utilizing user search behavior is necessary in campaigns

From Google’s personalized search algorithm that was launched in 2005, to 2015’s RankBrain, search results have consistently shifted towards searcher satisfaction rather than the needs of a webmaster or business. As users began to demand more intelligent, considered content (keyword stuffing is now a definitive no-go), we’ve had to adapt by creating engaging content that is authoritative in terms of knowledge and information.

There are clear signs that behavior signals are on Google’s radar. Google now elevates the results that it considers to be more relevant to a searcher based on profile information that it gathers about them. So, when it comes to creating your own outreach campaigns, it is only logical to harness and use this profile information to influence post-click user experience.

Harness search behavior to create customer profiles and develop positive relationships

Using search behavior information and user profiles is important because of the phenomenal results you can achieve, particularly at a time when advertising is becoming more challenging by the day.

Splitting users into customer profiles is a method that will enable the creation of targeted, tailor-made advertising and content that is more likely to result in conversions. There are a variety of ways that user behavior can be tracked and profiled, varying from more in-depth and specific methods to quicker, cheaper options that may benefit a brand looking to boost a current campaign or alter the way that their advertising is completed in-house. Not only will customer profiles ensure that only relevant content is delivered to users, but they can also contribute to the development of customer trust and loyalty.

Delete’s Leeds Beckett campaign delivered tailor-made landing pages and adverts to international students, with the aim of encouraging verbal contact with the university as early in the cycle as possible and making the application process easier and less daunting. By using geographical data, we were able to create customer profiles for international students, which meant we could serve carefully selected imagery to visitors from China, India, and Europe, as well as clear and relevant calls to action.

Segmenting potential customers by geography, interests, and the type of content they consume on the site is the most efficient way to create customer profiles. It can be done through both organic and paid searches, with each channel leading to different customer bases across a variety of platforms. Leveraging existing data is also a practical and simple solution that will help develop stronger relationships with a current customer base. You can then lead users to dynamic pages and imagery that reflect organic searches, geolocation, and paid advertising clicks.

The value in creating customer profiles from paid or organic searches

Advertisers now have to look for ways to outsmart the competition. Unfortunately, managing a campaign well is no longer anything special, but a default expectation. Try going beyond the boundaries of just “best practice” SEO or PPC and show real innovation and creativity; it will really pay off.

Using data from users’ organic searches enables a valuable customer profile of people who are already invested or interested in a brand. Applied to SEO, this creates the opportunity to tap into a receptive audience who will benefit from additional information, and who might have abandoned the conversion had they not been given access to the information they were looking for.

Delete’s campaign with Leeds Beckett University achieved phenomenal results. On a typical budget for a campaign of its caliber, we were able to generate approximately £6.9 million in revenue in one year, with an ROI of 10,403%. The use of customer profiles undoubtedly played a large part in this.

Use geographical data to deliver direct and relevant information

Aiming to target potential customers and increase conversions, Delete developed a live map that plotted the addresses of past enrollments, prospects gathered at educational fairs, and open day registrations. This completely changed the university’s geographical targeting in all marketing campaigns, resulting in a 691.67% increase in traffic to the clearing section.

By creating customer profiles based on geography, you can attract and cater to people who may have less initial interest, and reduce abandoned conversions caused by irrelevant content. It can also encourage behaviors that are natural and reflective of the user, with a lower cost per click and a higher volume of leads.

Revolutionize the way you use paid and organic search behavior for remarkable results

To maximize results in a marketing campaign, create dynamic landing pages and website experiences based on recorded search behaviors and the profiles that can subsequently be created from this information. When it comes to paid ads, you can pass targeting parameters and settings through to the website and use this information to personalize it.

With organic listings, you can glean user interests from the entrance pages users arrive at from organic search, and from what they do once they are on a page. If your landing pages are built to target the desired keywords well, you can also make reasonable assumptions about people landing on them from organic search, and then tailor the experience to them, even targeting specific interests.

For example, in our campaign with Leeds Beckett, if a user indicated an interest in a Civil Engineering degree (by clicking on a PPC ad from the “Civil Engineering for Undergraduates” ad group), the landing page or the whole website would start surfacing an image of a work placement student standing on a building site, wearing a hard hat and high-visibility jacket. This brings the individual student’s interests to the surface, highlighting the most relevant features the university has on offer. Ultimately, the aim here is to shorten the user journey and increase the chance of a conversion.
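A minimal sketch of how that wiring can work. All parameter names, ad group IDs, and filenames below are hypothetical; a real implementation would key off whatever tracking parameters the ad platform passes through:

# Map the PPC ad group carried in the landing URL's query string to a hero
# image; unknown or missing ad groups fall back to a generic campus shot.
HERO_IMAGES = {
    "civil-engineering-ug": "work-placement-hard-hat.jpg",
    "business-ug": "students-in-seminar.jpg",
}
DEFAULT_HERO = "campus-generic.jpg"

def hero_for(query_params):
    """query_params: dict parsed from the landing page URL."""
    return HERO_IMAGES.get(query_params.get("adgroup", ""), DEFAULT_HERO)

print(hero_for({"adgroup": "civil-engineering-ug"}))  # work-placement-hard-hat.jpg

The same lookup can drive copy, calls to action, or whole page sections, not just imagery.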

This can be applied to almost any marketing area or industry, and it will transform the way that your users are able to engage with your content.

The Long-Term Link Acquisition Value of Content Marketing

Posted on Aug 29, 2018 in SEO Articles


Posted by KristinTynski

Recently, new internal analysis of our work here at Fractl has yielded a fascinating finding:

Content marketing that generates mainstream press is likely 2X as effective as originally thought. Additionally, the long-term ROI is potentially many times higher than previously reported.

I’ll caveat that by saying this applies only to content that can generate mainstream press attention. At Fractl, this is our primary focus as a content marketing agency. Our team, our process, and our research are all structured around figuring out ways to maximize the newsworthiness and promotional success of the content we create on behalf of our clients.

Though data-driven content marketing paired with digital PR is on the rise, there is still a general lack of understanding around the long-term value of any individual content execution. In this exploration, we sought to answer the question: what link value does a successful campaign drive over the long term? What we found was surprising, and it strongly reinforced our conviction that this style of data-driven content and digital PR yields some of the highest possible ROI for link building and SEO.

To better understand this full value, we wanted to look at the long-term accumulation of the two types of links on which we report:

Direct links from publishers to our client’s content on their domain
Secondary links that link to the story the publisher wrote about our client’s content

While direct links are most important, secondary links often provide significant additional pass-through authority and can be reclaimed through additional outreach and converted into direct do-follow links (something we have a team dedicated to doing at Fractl).

Below is a visualization of the way our content promotion process works:

So how exactly do direct links and secondary links accumulate over time?

To understand this, we did a full audit of four successful campaigns from 2015 and 2016 through today. Having a few years of aggregation gave us an initial benchmark for how links accumulate over time for general interest content that is relatively evergreen.

We profiled four campaigns:

Perceptions of Perfection Across Borders
America’s Most P.C. and Prejudiced Places
Reverse-Photoshopping Video Game Characters
Water Bottle Germs Revealed

The first view we looked at was direct links, or links pointing directly to the client blog posts hosting the content we’ve created on their behalf.

There is a good deal of variability between campaigns, but we see a few interesting general trends that show up in all of the examples in the rest of this article:

Both direct and secondary links will accumulate in a few predictable ways:
A large initial spike with a smooth decline
A buildup to a large spike with a smooth decline
Multiple spikes of varying size

Roughly 50% of the total volume of links that will be built will accumulate in the first 30 days. The other 50% will accumulate over the following two years and beyond.
A small subset of direct links will generate their own large spikes of secondary links.

We’ll now take a look at some specific results. Let’s start by looking at direct links (pickups that link directly back to our client’s site or landing page).

The typical result: A large initial spike with consistent accumulation over time

This campaign, featuring artistic imaginings of what bodies in video games might look like with normal BMI/body sizes, shows the most typical pattern we witnessed, with a very large initial spike and a relatively smooth decline in link acquisition over the first month.

After the first month, long-term new direct link acquisition continued for more than two years (and is still going today!).

The less common result: Slow draw up to a major spike

In this example, you can see that sometimes it takes a few days or even weeks to see the initial pickup spike and subsequent primary syndication. In the case of this campaign, we saw a slow buildup to the pinnacle at about a week from the first pickup (exclusive), with a gradual decline over the following two weeks.


Zooming out to a month-over-month view, we can see resurgences in pickups happening at unpredictable intervals every few months or so. These spikes continued up until today with relative consistency. This happened as some of the stories written during the initial spike began to rank well in Google. These initial stories were then used as fodder or inspiration for stories written months later by other publications. For evergreen topics such as body image (as was the case in this campaign), you will also see writers and editors cycle in and out of writing about these topics as they trend in the public zeitgeist, leading to these unpredictable yet very welcomed resurgences in new links.

Least common result: Multiple spikes in the first few weeks

The third pattern we observed was seen on a campaign we executed examining hate speech on Twitter. In this case, we saw multiple spikes during this early period, corresponding to syndications on other mainstream publications that then sparked their own downstream syndications and individual virality.

Zooming out, we saw a similar result as in the other examples, with multiple smaller spikes within the first year and less frequent ones over the following two years. Each of these bumps is associated with the story resurfacing organically on new publications (usually a writer stumbling on coverage of the content during its initial phase of popularity).

Long-term resurgences

Finally, in our fourth example that looked at germs on water bottles, we saw a fascinating phenomenon happen beyond the first month where there was a very significant secondary spike.

This spike represents syndication across all or most of the iHeartRadio network. As this example demonstrates, it isn’t wholly unusual to see large-scale networks pick up content even a year or more later, producing spikes that rival or even exceed the initial month’s results.

Aggregate trends

When we looked at direct links back to all four campaigns together, we saw the common progression of link acquisition over time. The chart below shows the distribution of new links acquired over two years. We saw a pretty classic long tail distribution here, where 50% of the total links acquired happened in the first month, and the other 50% were acquired in the following two to three years.

“If direct links are the cake, secondary links are the icing, and both accumulate substantially over time.”

Links generated directly to the blog posts/landing pages of the content we’ve created on our clients’ behalf are only really a part of the story. When a campaign garners mainstream press attention, the press stories can often go mildly viral, generating large numbers of syndications and links to these stories themselves. We track these secondary links and reach out to the writers of these stories to try and get link attributions to the primary source (our clients’ blog posts or landing pages where the story/study/content lives).

These types of links also follow a similar pattern over time to direct links. Below are the publishing dates of these secondary links as they were found over time. Their distribution follows the same pattern, with 50% of the results realized within the first month and the other 50% of the value coming over the next two to three years.

The value in the long tail

By looking at multi-year direct and secondary links built to successful content marketing campaigns, it becomes apparent that the total number of links acquired during the first month is really only about half the story.

For campaigns that garner initial mainstream pickups, there is often a multi-year long tail of links that are built organically without any additional or future promotions work beyond the first month. While this long-term value is not something we report on or charge our clients for explicitly, it is extremely important to understand as a part of a larger calculus when trying to decide if doing content marketing with the goal of press acquisition is right for your needs.

Cost-per-link (a typical way to measure ROI of such campaigns) will halve if links built are measured over these longer periods — moving a project you perhaps considered a marginal success at one month to a major success at one year.
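The arithmetic is worth spelling out. A toy calculation with invented figures (these are not Fractl’s actual costs or link counts):

# If roughly half of all links arrive after month one, cost-per-link halves
# once the long tail is counted. All numbers below are hypothetical.
campaign_cost = 20_000   # dollars spent on the campaign
links_month_one = 100    # direct links built in the first 30 days
links_long_tail = 100    # roughly equal volume over the next 2-3 years

print(campaign_cost / links_month_one)                      # 200.0 per link at day 30
print(campaign_cost / (links_month_one + links_long_tail))  # 100.0 per link long-term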


A Quarter-Million Reasons to Use Moz’s Link Intersect Tool

Posted on Aug 29, 2018 in SEO Articles


Posted by rjonesx.

Let me tell you a story.

It begins with me in a hotel room halfway across the country, trying to figure out how I’m going to land a contract from a fantastic new lead worth $250,000 annually. We weren’t in over our heads by any measure, but the potential client was definitely looking at what most would call “enterprise” solutions, and we weren’t exactly “enterprise.”

Could we meet their needs? Hell yes we could — better than our enterprise competitors — but there’s a saying that “no one ever got fired for hiring IBM”; in other words, it’s always safe to go with the big guys. We weren’t an IBM, so I knew that by reputation alone we were in trouble. The RFP was dense, but like most SEO gigs, there wasn’t much in the way of opportunity to really differentiate ourselves from our competitors. It would be another “anything they can do, we can do better” meeting where we grasp for reasons why we were better. In an industry where so many of our best clients require NDAs that prevent us from producing really good case studies, how could I prove we were up to the task?

In less than 12 hours we would be meeting with the potential client and I needed to prove to them that we could do something that our competitors couldn’t. In the world of SEO, link building is street cred. Nothing gets the attention of a client faster than a great link. I knew what I needed to do. I needed to land a killer backlink, completely white-hat, with no new content strategy, no budget, and no time. I needed to walk in the door with more than just a proposal — I needed to walk in the door with proof.

I’ve been around the block a few times when it comes to link building, so I wasn’t at a loss when it came to ideas or strategies we could pitch, but what strategy might actually land a link in the next few hours? I started running prospecting software left and right — all the tools of the trade I had at my disposal — but imagine my surprise when the perfect opportunity popped up right in little old Moz’s Open Site Explorer Link Intersect tool. To be honest, I hadn’t used the tool in ages. We had built our own prospecting software on APIs, but the perfect link just popped up after adding in a few of their competitors on the off chance that there might be an opportunity or two.

There it was:

3,800 root linking domains to the page itself
The page was soliciting submissions
Took pull requests for submissions on GitHub!

I immediately submitted a request and began the refresh game, hoping the repo was being actively monitored. By the next morning, we had ourselves a link! Not just any link, but despite the client having over 50,000 root linking domains, this was now the 15th best link to their site. You can imagine me anxiously awaiting the part of the meeting where we discussed the various reasons why our services were superior to that of our competitors, and then proceeded to demonstrate that superiority with an amazing white-hat backlink acquired just hours before.

The quarter-million-dollar contract was ours.

Link Intersect: An undervalued link building technique

Backlink intersect is one of the oldest link building techniques in our industry. The methodology is simple. Take a list of your competitors and identify the backlinks pointing to their sites. Compare those lists to find pages that overlap. Pages which link to two or more of your competitors are potentially resource pages that would be interested in linking to your site as well. You then examine these sites and do outreach to determine which ones are worth contacting to try and get a backlink.
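Under the hood, the technique is just a set intersection over backlink exports. A from-scratch sketch, assuming you have already exported each competitor’s backlinks as (linking_page, target) rows from whatever tool you use:

from collections import defaultdict

def link_intersect(backlinks_by_competitor, min_overlap=2):
    """Return pages that link to at least min_overlap of the competitors."""
    competitors_linked = defaultdict(set)
    for competitor, rows in backlinks_by_competitor.items():
        for linking_page, _target in rows:
            competitors_linked[linking_page].add(competitor)
    # Pages that link to several competitors are likely resource pages.
    return {page: comps for page, comps in competitors_linked.items()
            if len(comps) >= min_overlap}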

Let’s walk through a simple example using Moz’s Link Intersect tool.

Getting started

We start on the Link Intersect page of Moz’s new Link Explorer. While we had Link Intersect in the old Open Site Explorer, you’re going to want to use our new Link Intersect, which is built from our giant index of 30 trillion links and is far more powerful.

For our example here, I’ve chosen a random gardening company in Durham, North Carolina called Garden Environments. The website has a Domain Authority of 17 with 38 root linking domains.

We can go ahead and copy-paste the domain into “Discover Link Opportunities for this URL” at the top of the Link Intersect page. If you notice, we have the choice of “Root Domain, Subdomain, or Exact Page”:

I almost always choose “root domain” because I tend to be promoting a site as a whole and am not interested in acquiring links to pages on the site from other sites that already link somewhere else on the site. That is to say, by choosing “root domain,” any site that links to any page on your site will be excluded from the prospecting list. Of course, this might not be right for your situation. If you have a hosted blog on a subdomain or a hosted page on a site, you will want to choose subdomain or exact page to make sure you rule out the right backlinks.

You also have the ability to choose whether we report back root linking domains or individual pages. This is really important, and I’ll explain why.

Depending on your link building campaign, you’ll want to vary your choice here. Let’s say you’re looking for resource pages that you can list your website on. If that’s the case, you will want to choose “pages.” The Link Intersect tool will then prioritize pages that have links to multiple competitors on them, which are likely to be resource pages you can target for your campaign. Now, let’s say you would rather find publishers that talk about your competitors and are less concerned about them linking from the same page. You want to find sites that have linked to multiple competitors, not pages. In that case, you would choose “domains.” The system will then return the domains that have links to multiple competitors and give you example pages, but you won’t be limited only to pages with multiple competitors on them.

In this example, I’m looking for resource pages, so I chose “pages” rather than domains.

Choosing your competitor sites

A common mistake made at this point is to choose exact competitors. Link builders will often copy and paste a list of their biggest competitors and cross their fingers for decent results. What you really want are the best link pages and domains in your industry — not necessarily your competitors.

In this example I chose the gardening page on a local university, a few North Carolina gardening and wildflower associations, and a popular page that lists nurseries. Notice that you can choose subdomain, domain, or exact page as well for each of these competitor URLs. I recommend choosing the broadest category (domain being broadest, exact page being narrowest) that is relevant to your industry. If the whole site is relevant, go ahead and choose “domain.”

Analyzing your results

The results returned will prioritize pages that link to multiple competitors and have a high Domain Authority. Unlike some of our competitors’ tools, if you put in a competitor that doesn’t have many backlinks, it won’t cause the whole report to fail. We list all the intersections of links, starting with the most and narrowing down to the fewest. Even though the nurseries website doesn’t provide any intersections, we still get back great results!

Now we have some really great opportunities, but at this point you have two choices. If you really prefer, you can just export the opportunities to CSV like any other tool on the market, but I prefer to go ahead and move everything over into a Link Tracking List.

By moving everything into a link list, we’re going to be able to track link acquisition over time (once we begin reaching out to these sites for backlinks) and we can also sort by other metrics, leave notes, and easily remove opportunities that don’t look fruitful.

What did we find?

Remember, we started off with a site that has barely any links, but we turned up dozens of easy opportunities for link acquisition. We turned up a simple resources page on forest resources, a potential backlink which could easily be earned via a piece of content on forest stewardship.

We turned up a great resource page on how to maintain healthy soil and yards on a town government website. A simple guide covering the same topics here could easily earn a link from this resource page on an important website.

These were just two examples of easy link targets. From community gardening pages, websites dedicated to local creek, pond, and stream restoration, and general enthusiast sites, the Link Intersect tool turned up simple backlink gold. What is most interesting to me, though, was that these resource pages never included the words “resources” or “links” in the URLs. Common prospecting techniques would have just missed these opportunities altogether.

While it wasn’t the focus of this particular campaign, I did try the alternative of “show domains” rather than “pages” that link to the competitors. We found similarly useful results using this methodology.

For example, we found CarolinaCountry.com had linked to multiple of the competitor sites and, as it turns out, would be a perfect publication to pitch for a story as part of a PR campaign promoting the gardening site.

Takeaways

The new Link Intersect tool in Moz’s Link Explorer combines the power of our new incredible link index with the complete features of a link prospecting tool. Competitor link intersect remains one of the most straightforward methods for finding link opportunities and landing great backlinks, and Moz’s new tool coupled with Link Lists makes it easier than ever. Go ahead and give it a run yourself — you might just find the exact link you need right when you need it.



Faceted Navigation and SEO: A Deeper Look

Posted on Aug 29, 2018 in SEO Articles


The complex web of factors that determine page counts for a site with faceted navigation. It’s about the SEO, folks

tl;dr: Skip to each “Takeaways” section if you want a few ideas for handling faceted navigation and SEO. But do so at your own risk. The “why” is as important as the “what.”

If you have ever shopped for anything online, you’ve seen faceted navigation. This is the list of clickable options, usually in the left panel, that can be used to filter results by brand, price, color, etc. Faceted navigation makes it possible to mix & match options in any combination the user wishes. It’s popular on large online stores because it allows the user to precisely drill down to only the things they are interested in.

An example of faceted navigation

But this can cause huge problems for search engines because it generates billions of useless near-duplicate pages. This wastes crawl budget, lowers the chances that all of the real content will get indexed, and it gives the search engines the message that the site is mostly low-quality junk pages (because, at this point, it is).

Many articles talk about faceted navigation and how to mitigate the SEO problems that it causes. Those are reactive strategies: How to prevent the search engines from crawling and indexing the billions of pages your faceted navigation created.

This is not one of those how-to articles.

Instead, it’s about the decisions that create massive duplication and how to avoid them from the start. It’s about the seemingly innocuous UX choices and their unintended consequences. My goal is to give you a deeper understanding of how each decision affects crawlability and final page counts. I’m hoping this will give you knowledge you can use, both to avoid problems before they start and to mitigate problems that can’t be avoided.

Match Types and Grouping

Faceted navigation is typically divided into groups, with a list of clickable options in each group. There might be one group for brand names, another for sizes, another for colors, etc. The options in a group can be combined in any of a few different ways:

“AND” matching — With this match type, the store only shows an item if it matches all of the selected options. “AND” matching is most often used for product features where it is assumed the shopper is looking for a specific combination of features, and is only interested in a product if it has all of them. (e.g., headphones that are both wireless and noise-canceling)
“OR” matching — With this match type, the store shows items that match any of the selected options. This can be used for lists of brand names, sizes, colors, price ranges, and many other things. The assumption here is that the user is interested in a few different things, and wants to see a combined list that includes all of them. (e.g., all ski hats available in red, pink or yellow).
“Radio button” matching — With this match type, only one option may be selected at a time. Selecting one option deselects all others. The assumption here is that the options are 100% mutually exclusive, and nobody would be interested in seeing more than one of them at a time. Radio buttons are often used to set sort order. They are also sometimes used to choose between mutually exclusive categories (e.g., specifying the smartphone brand/model when shopping for phone cases). Some radio button implementations require at least one selected option (e.g., for sort order), and others don’t (e.g., for categories).

The options within a given group can be combined using any one of these match types, but the groups themselves are almost always combined with each other using “AND” matching. For example, if you select red and green from the “colors” group, and you select XL and XXL from the “sizes” group, then you will get a list of every item that is both one of those two colors and one of those two sizes.
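In code, those rules are small. A minimal sketch, assuming a hypothetical data model where each product carries a set of option tags:

def matches(product_tags, groups):
    """groups: list of (match_type, selected_options) pairs. Groups always
    combine with AND; match_type only governs options within a group."""
    for match_type, selected in groups:
        if not selected:
            continue                         # nothing selected: no constraint
        if match_type == "AND":              # must carry every selected option
            ok = all(o in product_tags for o in selected)
        elif match_type == "OR":             # must carry at least one
            ok = any(o in product_tags for o in selected)
        else:                                # "RADIO": at most one is selected
            ok = selected[0] in product_tags
        if not ok:
            return False
    return True

# A red wool ski hat: red satisfies the OR group, ski satisfies the AND group.
print(matches({"red", "wool", "ski"}, [("OR", ["red", "pink", "yellow"]),
                                       ("AND", ["ski"])]))  # True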

A typical real-world website will have several groups using different match types, with many options between them. The total number of combinations can get quite large:

The above example has just over 17 billion possible combinations. Note that the total number of actual pages will be much larger than this because the results from some combinations will be split across many pages.

For faceted navigation, page counts are ultimately determined by three main things:

The total number of possible combinations of options — In the simplest case (with only “AND” & “OR” matching, and no blocking) the number of combinations will be 2^n, where n is the number of options. For example, if you have 12 options, then there will be 2^12, or 4,096 possible combinations. This gets a bit more complicated when some of the groups are radio buttons, and it gets a lot more complicated when you start blocking things.
The number of matching items found for a given combination — The number of matching items is determined by many factors, including match type, the total number of products, the fraction of products matched by each filter option, and the amount of overlap between options.
The maximum number of items to be displayed per page — This is an arbitrary choice set by the site designer. You can set this to any number you want. A bigger number means fewer pages but more clutter on each of them.


Test: How Does Match Type Affect Page Counts?

The choice of match type affects the page count by influencing both the number of combinations of options and also the number of matching items per combination.

How were these results calculated?
All of the numeric results in this article were generated by a simulation script written for this purpose. This script works by modeling the site as a multi-dimensional histogram, which is then repeatedly scaled and re-combined with itself each time a new faceted nav option is added to the simulated site. The script simulates gigantic sites with many groups of different option types relatively quickly. (For previous articles, I have always generated crawl data using an actual crawler, running on a test website made up of real HTML pages. That works fine when there are a few tens of thousands of pages, but some of the tests for this article have trillions of pages. That would take my crawler longer than all of recorded human history to crawl. Civilizations rise and fall over centuries. I decided not to wait that long.)

Test #1 — Simple “AND” Matching

Suppose we have a site with the following properties:

The faceted nav consists of one big group, with 32 filtering options that can be selected in any combination.
There are 10,000 products.
On average, each filtering option matches 20% of the products.
The site displays (up to) 10 products per page.
Options are combined using “AND” matching.

The above assumptions give you a site with:

4,294,967,296 different combinations of options
4,295,064,687 pages.
4,294,724,471 empty results.

The obvious: The number of pages is enormous, and the vast majority of them are empty results. For every 12,625 pages on this site, one shows actual products. The rest show the aggravating “Zero items found” message. This is a terrible user experience and a colossal waste of crawl budget. But it’s also an opportunity.
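Those figures can be reproduced with a few lines of arithmetic. A sketch of the simplest version of the model: with k options selected, “AND” matching leaves floor(10,000 × 0.2^k) products, and every combination, even an empty one, renders at least one page:

from math import ceil, comb

N_OPTIONS, N_PRODUCTS, MATCH_FRAC, PER_PAGE = 32, 10_000, 0.2, 10

pages = empty = 0
for k in range(N_OPTIONS + 1):                # k = number of selected options
    combos = comb(N_OPTIONS, k)               # combinations with exactly k selected
    items = int(N_PRODUCTS * MATCH_FRAC ** k) # products surviving k "AND" filters
    if items == 0:
        empty += combos                       # one "Zero items found" page each
        pages += combos
    else:
        pages += combos * ceil(items / PER_PAGE)

print(f"{2 ** N_OPTIONS:,} combinations")     # 4,294,967,296
print(f"{pages:,} pages")                     # 4,295,064,687
print(f"{empty:,} empty results")             # 4,294,724,471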

So what can we do about all those empty results? If you are in control of the server side code, you can remove them. Any option that would lead to a page that says “Zero items found” should either be grayed out (and no longer coded as a link) or, better yet, removed entirely. This needs to be evaluated on the server side each time a new page is requested. If this is done correctly, then each time the user clicks on another option, all of the remaining options that would have led to an empty result will disappear. This reduces the number of pages, and it also dramatically improves the user experience. The user no longer has to stumble through a maze of mostly dead ends to find the rare combinations that show products.
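A sketch of that server-side check, assuming “AND” semantics and products stored as tag sets (the data model is hypothetical; a real implementation would push this into a database query rather than a Python loop):

def visible_options(products, selected, all_options):
    """products: list of tag sets; selected: set of already-chosen options.
    Keep only the options that still yield at least one product."""
    live = []
    for opt in all_options - selected:
        trial = selected | {opt}
        # "AND" matching: a product matches if it carries every trial option.
        if any(trial <= tags for tags in products):
            live.append(opt)   # render the link
        # else: drop or gray out the link; it leads to "Zero items found"
    return live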

So let’s try this.

Test #2 — “AND” Matching, With Empty Results Removed

This test is identical to Test #1, except now all links that lead to empty results are silently removed.

This time, we get:

1,149,017 (reachable) combinations of options.
1,246,408 pages.
0 empty results. (obviously, because we’ve removed them)

This may still seem like a lot, but it’s a significant improvement over the previous test. The page count has gone from billions down to just over one million. This is also a much better experience for the users, as they will no longer see any useless options that return zero results. Any site that has faceted nav should be doing this by default.

Test #3 — “OR” Matching

This test uses the same parameters as Test #1, except it uses “OR” matching:

The faceted nav still has 32 filtering options
There are still 10,000 products.
Each filtering option still matches 20% of the products.
The site still displays 10 products per page.
Options are now combined using “OR” matching instead of “AND” matching.

This gives us:

4,294,967,296 different combinations of options.
4,148,637,734,396 pages (!)
0 empty results.

The number of combinations is precisely the same, but the number of pages is much higher now (966 times higher), and there are no longer any empty results. Why is the page count so high? Because, with “OR” matching, every time you click on a new option the number of matching items increases. This is the opposite of “AND” matching, where the number decreases. In this test, most combinations now include almost all of the products on the site. In Test #1, most combinations produced empty results.

There are no empty results at all in this new site. The only way there could be an empty result would be if you chose to include a filtering option that never matches anything (which would be kind of pointless). The strategy of blocking empty results does not affect this match type.

Test #4 — Radio Buttons

This test uses radio button matching.

If we repeat Test #1, but with radio button matching, we get:

33 different combinations of options.
7,400 pages.
0 empty results.

This is outrageously more efficient than any of the others. The downside of radio button matching is that it’s much more restrictive in terms of user choice.

The takeaway: Always at least consider using radio button matching when you can get away with it (any time the options are mutually exclusive). It will have a dramatic effect on page counts.

Recap of Tests #1–4:

Test 1, “AND” matching (without blocking empty results): 4,295,064,687 pages
Test 2, “AND” matching with empty results blocked: 1,246,408 pages
Test 3, “OR” matching: 4,148,637,734,396 pages
Test 4, radio buttons: 7,400 pages

Takeaways

The choice of match type is important and profoundly impacts page counts.
“OR” matching can lead to extremely high page counts.
“AND” matching isn’t as bad, provided you are blocking empty results.
You should always block empty results.
Blocking empty results helps with “AND” matching, but doesn’t affect “OR” matching.
Always use radio buttons when the options are mutually exclusive.

How Grouping Affects Page Count

So far, we have looked at page counts for sites that have one big group of options with the same match type. That’s unrealistic. On a real website, there will usually be many groups with different match types. The exact way the options are separated into groups is another factor that can affect page counts.

Test #5 — “OR” Matching, Split Into Multiple Groups

Let’s take the original parameters from Test #3:

The faceted nav has a total of 32 filtering options.
There are 10,000 products.
On average, each filtering option matches 20% of the products.
The site displays up to 10 products per page.
Options are combined using “OR” matching.

But this time, we’ll redo the test several times, and each time, we’ll split the 32 options into a different number of groups.

This gives us:

1 group with 32 options: 4,148,637,734,396 pages, 0 empty results
2 groups with 16 options each: 2,852,936,777,269 pages, 0 empty results
4 groups with 8 options each: 466,469,159,950 pages, 0 empty results
8 groups with 4 options each: 5,969,194,867 pages, 290,250,752 empty results
16 groups with 2 options each: 4,296,247,759 pages, 4,275,284,621 empty results

The interesting thing here is that the last two tests have some empty results. Yes, all groups used “OR” matching, and yes, I told you “OR” matching does not produce empty results. So what’s going on here? Remember, no matter which match types are used within each group, the groups are combined with each other using “AND” matching. So, if you break an “OR” group into many smaller “OR” groups, you get behavior closer to an “AND” group.

Another way to put it: Suppose there are eight groups with four options each, and the user has selected exactly one option from each group. For any item to show up in those results, the item would have to match all eight of those selected options. This is functionally identical to what you would get if those eight selected options were part of an “AND” group.

If you are blocking empty results (which you should be doing anyway), then the actual page counts for the last two tests will be much smaller than is shown in this table. Before you get all excited, note that you have to have quite a few groups before this starts happening. It’s possible some site might be in a market where it makes sense to have eight groups with four options each, but it isn’t something that will happen often.

The boring but more practical observation is that even breaking the group into two parts reduces the page count noticeably. The difference isn’t huge, but it’s enough to be of some value. If a group of options that uses “OR” matching can be logically separated into two or more smaller groups, then it may be worth doing.

Test #6 — “AND” Matching, Split Into Multiple Groups

(I’m including this test because, if I don’t, people will tell me I forgot to do this one)

This test is the same as Test #5, but with “AND” matching instead of “OR” matching (and empty results are now being blocked).

1 group with 32 options: 1,246,408 pages
2 groups with 16 options each: 1,246,408 pages
4 groups with 8 options each: 1,246,408 pages
8 groups with 4 options each: 1,246,408 pages
16 groups with 2 options each: 1,246,408 pages

Yep. They all have the same number of pages. How can this be? The options within each group use “AND” matching, and groups are combined with each other using “AND” matching, so it doesn’t matter if you have one group or several. They are functionally identical.

Takeaway

If you want to split up an “AND” group because you think it will make sense to the user or will look nicer on the page, then go for it, but it will not affect page counts.

Other Things that Affect Page Counts
Test #7 — Changing “Items per Page”

This test uses the following parameters:

The faceted nav consists of five groups, with varying option counts, and a mix of different match types.
There are 10,000 products.
On average, each filtering option matches 20% of the products.
Links to empty results are blocked.

The test was repeated with different values for “Items per Page.”

This gives us:

10 items per page: 18,690,151,025 pages
32 items per page: 10,808,363,135 pages
100 items per page: 8,800,911,375 pages
320 items per page: 8,309,933,890 pages
1,000 items per page: 8,211,780,310 pages

This makes a difference when the values are small, but the effect tapers off as the values get larger.

Test #8 — Adding a Pagination Limit

Some sites, especially some very large online stores, try to reduce database load by setting a “pagination limit.” This is an arbitrary upper limit to the number of pages that can be returned for a given set of results.

For example, if a given filter combination matches 512,000 products, and the site is set up to show 10 products per page, this particular combination would normally create 51,200 pages. Some sites set an arbitrary limit of, say, 100. If the user clicks all the way to page 100, there is no link to continue further.
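The page math for a single filter combination, with and without the cap, as a short sketch using the numbers above:

from math import ceil

def page_count(matching_items, per_page=10, limit=None):
    pages = max(1, ceil(matching_items / per_page))  # even 0 items renders 1 page
    return min(pages, limit) if limit else pages

print(page_count(512_000))             # 51,200 pages uncapped
print(page_count(512_000, limit=100))  # 100 pages with the arbitrary cap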

These sites do this because, compared to delivering pages at the start of a pagination structure, delivering pages deeper in the structure creates a massive load on the database (for technical reasons beyond the scope of this article). The larger the site, the greater the load, so the largest sites have to set an arbitrary limit.

This test uses the following parameters:

The faceted nav consists of five groups, with varying option counts, and a mix of different match types.
There are 500,000 products.
On average, each filtering option matches 20% of the products.
Links to empty results are blocked.

The test was repeated with different values for the pagination limit.

This gives us:

Pagination limit 5: 12,079,937,370 pages
Pagination limit 10: 13,883,272,770 pages
Pagination limit 20: 15,312,606,795 pages
Pagination limit 40: 16,723,058,170 pages
Pagination limit 80: 17,680,426,670 pages
Pagination limit 160: 18,252,882,040 pages
No pagination limit: 18,690,151,025 pages

That’s definitely an improvement, but it’s underwhelming. If you cut the pagination limit in half, you don’t wind up with half as many pages. It’s more in the neighborhood of 90% as many. But this improvement is free because this type of limit is usually added for reasons other than SEO.

Pagination Takeaways

Test 7:

For lower values, changing “Items per Page” improves page counts by a noticeable amount.
When the values get higher, the effect tapers off. This is happening because most of the results now fit on one page. (and the page count can’t get lower than one)

Test 8:

If you have a huge site implementing a pagination limit primarily for database performance reasons, you may see a minor SEO benefit as a free bonus.
If you’re not also doing this to reduce database load, it’s not worth it.

Selectively Blocking Crawlers

All of the tests so far let the crawler see all of the human-accessible pages. Now let’s look at strategies that work by blocking pages via robots meta, robots.txt, etc.

Before we do that, we need to be clear about what “page count” really means. There are actually three different “page counts” that matter here:

Human-readable page count — Pages that can be viewed by a human being with a browser.
Crawlable page count — Pages that a search engine crawler is allowed to request.
Indexable page count — The number of pages that the search engine is allowed to index, and to potentially show in search results.

The crawlable page count is important because it determines how much crawl budget is wasted. This will affect how thoroughly and how frequently the real content on the site gets crawled. The indexable page count is important because it effectively determines how many thin, near-duplicate pages the search engines will try to index. This is likely to affect the rankings of the real pages on the site.

Test #9 — Selection Limit via Robots Meta with “noindex, nofollow”

In this test, if the number of selected options on the page gets above a pre-specified limit, then <meta name="robots" content="noindex,nofollow"> will be inserted into the HTML. This tells the search engines not to index the page or follow any links from it.
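The rule itself is a one-line condition. A sketch of the logic (the helper name is a placeholder, and the limit of 2 is just one of the values this test varies from 0 to 5):

SELECTION_LIMIT = 2

def robots_meta_tag(selected_options):
    """Emit the blocking tag once too many facet options are selected."""
    if len(selected_options) > SELECTION_LIMIT:
        return '<meta name="robots" content="noindex,nofollow">'
    return ""  # at or below the limit: page stays indexable and followable

print(robots_meta_tag(["red", "wool", "ski"]))  # blocked, since 3 > 2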

This test uses the following parameters:

The faceted nav consists of five groups, with varying option counts, and a mix of different match types.
There are 10,000 products.
On average, each filtering option matches 20% of the products.
Links to empty results are blocked.

For this test, the “selection limit” is varied from 0 to 5. Any page where the number of selected options is larger than this selection limit will be blocked, via robots meta tag with noindex, nofollow.

Selection limit 0: 11,400 crawlable pages, 1,000 indexable pages
Selection limit 1: 79,640 crawlable pages, 11,400 indexable pages
Selection limit 2: 470,760 crawlable pages, 79,640 indexable pages
Selection limit 3: 2,282,155 crawlable pages, 470,760 indexable pages
Selection limit 4: 9,269,631 crawlable pages, 2,282,155 indexable pages
Selection limit 5: 32,304,462 crawlable pages, 9,269,631 indexable pages
No selection limit: 18,690,151,025 crawlable pages, 18,690,151,025 indexable pages

In these results, both indexable and crawlable page counts are reduced dramatically, but the number of crawlable pages is reduced by much less. Why? Because a robots meta tag is part of the HTML code of the page it is blocking. That means the crawler has to load the page in order to find out it has been blocked. A robots meta tag can block indexing, but it can’t block crawling. It still wastes crawl budget.

You might well ask: If robots meta can’t directly block a page from being crawled, then why is the crawlable page count reduced at all? Because crawlers can no longer reach the deepest pages: The pages that link to those pages are no longer followed or indexed. Robots meta can’t directly block crawling of a particular page, but it can block the page indirectly, by setting “nofollow” for all of the pages that link to it.

Test #10 — Repeat of Test #9, But With “noindex, follow”

This a repeat of test #9, except now the pages are blocked by a robots meta tag with “noindex, follow” instead of “noindex, nofollow.” This tells the crawler that it still shouldn’t index the page, but it is OK to follow the links from it.

(I’m only including this one because, if I don’t, someone is bound to tell me I forgot to include it.)

Selection limit 0: 18,690,151,025 crawlable pages, 1,000 indexable pages
Selection limit 1: 18,690,151,025 crawlable pages, 11,400 indexable pages
Selection limit 2: 18,690,151,025 crawlable pages, 79,640 indexable pages
Selection limit 3: 18,690,151,025 crawlable pages, 470,760 indexable pages
Selection limit 4: 18,690,151,025 crawlable pages, 2,282,155 indexable pages
Selection limit 5: 18,690,151,025 crawlable pages, 9,269,631 indexable pages
No selection limit: 18,690,151,025 crawlable pages, 18,690,151,025 indexable pages

This scheme reduces the number of indexable pages, but it does nothing whatsoever to prevent wasted crawl budget. Wasted crawl budget is the main problem that needs to be solved here, which makes this scheme useless for faceted navigation. There are some use cases (unrelated to faceted nav) where “noindex, follow” is a good choice, but this isn’t one of them.

Can the selection limit be implemented with robots.txt?

As shown in test #9, using robots meta tags to implement a selection limit is not ideal, because robots meta tags are part of the HTML of the page. The crawler has to load each page before it can find out if the page is blocked. This wastes crawl budget.

So what about using robots.txt instead? Robots.txt seems like a better choice for this, because it blocks pages from being crawled, unlike robots meta, which blocks pages from being indexed and/or followed. But can robots.txt be used to selectively block pages based on how many options they have selected? The answer is: it depends.

This depends on the URL structure. In some cases it’s simple, in others it’s difficult or impossible.

For example, if the URL structure uses some completely impenetrable format like base-64-encoded JSON:

https://example.com/products?p=WzczLCA5NCwgMTkxLCAxOThd

Then you are out of luck. You cannot use robots.txt to filter this, because there’s no way for robots.txt to tell how many selected options there are. You’ll have to use robots meta or X-Robots headers (both of which can be generated by the server-side code, which has access to the decoded version of the query data).
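Server-side, that decoding is trivial even though crawlers can’t do it. A sketch using the payload from the example URL above:

import base64, json

def selected_option_count(p_param):
    """Decode the base-64 JSON filter payload and count selected options."""
    return len(json.loads(base64.b64decode(p_param)))

# The payload from the example URL decodes to [73, 94, 191, 198].
print(selected_option_count("WzczLCA5NCwgMTkxLCAxOThd"))  # 4

The server can then compare that count to the selection limit and emit the appropriate robots meta tag or X-Robots-Tag header.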

On the other hand, if all filter options are specified as a single underscore-separated list of ID numbers in the query string, like this:

https://example.com/products?filters=73_94_191_198

Then you can easily block all pages that have more than (for example) two options selected, by doing this:

User-agent: *
Disallow: /products?*filters=*_*_

So let’s try this.

Test #11 — Selection Limit, via Robots.txt

This is a repeat of test #9, except now the pages are blocked using robots.txt instead of robots meta.

Selection limit 0: 1,000 crawlable pages, 1,000 indexable pages
Selection limit 1: 11,400 crawlable pages, 11,400 indexable pages
Selection limit 2: 79,640 crawlable pages, 79,640 indexable pages
Selection limit 3: 470,760 crawlable pages, 470,760 indexable pages
Selection limit 4: 2,282,155 crawlable pages, 2,282,155 indexable pages
Selection limit 5: 9,269,631 crawlable pages, 9,269,631 indexable pages
No selection limit: 18,690,151,025 crawlable pages, 18,690,151,025 indexable pages

Takeaways

Blocking pages based on a selection limit is a very effective way to reduce page counts.
Implementing this with robots.txt is best.
But you can only use robots.txt if the URL structure allows it.
Implementing this with robots meta is less effective than robots.txt, but still useful.

Summary

Faceted navigation is one of the thorniest SEO challenges large sites face. Don’t wait to address issues after you’ve built your site. Plan ahead. Use robots.txt, consider selection limits, and “think” like a search engine.

A little planning can improve use of crawl budget, boost SEO, and improve the user experience.

The post Faceted Navigation and SEO: A Deeper Look appeared first on Portent.

How to scale content production to capture the long-tail opportunity

Posted on Aug 28, 2018 in SEO Articles

Here’s something we all know so well that nobody needs to say it anymore: content is king.

We know it because we’ve been hit over the head with the phrase more times than you can shake a page view at. There’s no getting away from it: producing high-quality, engaging content and unique copy is vital for SEO, brand awareness, and affinity.

There will be few digital marketers out there who are not painfully aware of the challenge. When resources, time, and money are (more likely than not) limiting factors, how do you produce large amounts of content to a high enough standard to be effective?

This can be especially true if you or your client is a business with many different product lines, or in multiple locations around the world. The potential topics are infinite, red tape acts as a bottleneck, and copywriters can be overworked and expensive.

The good news is that with the rising popularity of remote working and digital nomads, combined with a solid strategy and process, you don’t have to make the impossible choice between quality and quantity.

Use a network of freelancers

Perhaps you have a short-term project in the pipeline, or your client suddenly wants to dramatically increase the amount of content in production. What do you do? Hiring a team of copywriters is expensive.

The freelance market, however, is competitive, and these days you don’t have to compromise quality for the sake of cost. Digital nomads are highly skilled, maybe even multilingual, and are likely to be based in countries where the cost of living is low.

Of course, this might not work for you if you need writers based in your market, in which case you could use your international freelancers in other ways. Have you got a killer strategist on your books, or someone who speaks four languages who could translate and localize your copy using their knowledge of your markets? Make use of their skills.

It goes without saying that good communication is central to making it work with freelancers. Make yourself as available as possible to your writers and remind them again and again that there is no such thing as a silly question. Building a personal rapport is vital—video calls are great for this, and often far quicker than trying to painfully explain something over email. Apps such as Google Hangouts will become your best friend, for when a simple question requires a quick answer.

With freelancers you have the opportunity to not only become more cost-effective, but to make time zones work for you. This is the key: whilst you’re sleeping, some of your freelancers will be working. Manage this effectively and the amount you produce will rapidly increase, without compromising on quality.

Establish a process

It sounds absurdly simple, but if you don’t set up a clear, defined process, you run a very real risk of not achieving the core goals of the project. Common pitfalls include repeating work (or producing the wrong content due to poor briefs), missing deadlines, and inefficiently handling budgets.

It may take some time to set up, but it will undoubtedly pay off once it’s up, running, and ticking along by itself whilst you dedicate yourself to other tasks.

Firstly, one of the most useful things you can do is to spend some time getting your briefs watertight. Provide key details about the client, background information for the task such as the target audience, and clearly explain how this work fits into the wider strategy. Outline the deliverables clearly, and provide a step-by-step guide and examples if necessary.

Brief templates can help with this, especially if you’re producing different types of hygiene content for the same client. It will be worth it when you receive the work back exactly as needed, with minimal questions in the process, and future you will thank you.

Secondly, I strongly advise setting up trackers, because let’s face it: the benefit of a good Excel document cannot be overstated. Create them so you know at a glance what stage your project is at, and include pricing information and details of your freelancers. These trackers should essentially be a one-stop shop for everything you need to know about the project. This will be invaluable not only for measuring where you are in the process but also for reporting.

Project tracking and management services such as Trello can be a godsend. Make use of them. Here at Croud we have our own proprietary technology, Croud Control, which allows us to manage huge content projects flexibly, with full visibility and control over every aspect of each project.

If this all sounds a little exhausting, why not use a trusted freelancer to manage this process for you? That way you only need to brief one person (although admittedly you will probably need to do a deep-dive), and providing you have regular check-ins along the way, you will only need to get involved at the final stage.

QA, QA, and QA again

Speaking of the final stage: check everything. Then check again.

It is unavoidable that your copywriters will make mistakes; they are human beings. It’s also possible that your proofreaders will miss the odd spelling mistake here or there. This is why I operate a two-stage QA process at a minimum.

If your client is a multinational company, you may be required to translate or localize your copy into several different languages. It goes without saying that native speakers should perform the QA on this type of work, especially if the copywriter was a non-native speaker.

Providing your freelancers with feedback is crucial to the success of content projects, aside from just being a decent thing to do. After all, everyone wants to do a good job and more likely than not, wants to know how they could do it better.

Tight budgets mean you might have to get creative with how you manage it, and this QA process allows me to do just that. If a new, potentially inexperienced copywriter with good writing skills and a low hourly rate does the bulk of the work, the more skilled (and almost certainly more expensive) writers can be lined up to proofread, check tone, and generally make sure the copy is up to scratch, in half the time it took to write. Just make sure they don’t end up rewriting the work. Empower them to provide constructive feedback directly to your copywriters, and effectively train them up.

If your QAs pick up on the same mistakes being made repeatedly, allow your copywriters the opportunity to review their edits. If they can actually see the corrections being made, they are more likely to bear them in mind when they write for you again. If fewer edits are required, then congratulations, you have made the process even more efficient and cost-effective.

Summary

Creating high-quality, unique copy and content on a large scale is never going to be easy, but it doesn’t have to be painful. With a bit of legwork at the beginning to establish a well-built process, and by making the most of a network of freelancers, it has the potential to be a breeze.

Not only that, but you and your clients will undoubtedly reap the commercial rewards of your hard work. Using exactly this process, together with our global network of 1,700+ freelancers known as ‘Croudies’, we were able to produce city-specific landing page copy for a client with hundreds of locations. This work led to a 113% increase in organic traffic, coupled with a 124% uplift in domain visibility.

And the key to success? Engage your writers at every available opportunity, so they don’t feel like a cog in a machine. Provide them with valuable feedback and help them whenever you can. This will likely not only improve your enjoyment of the project, but you’ll also probably find that they are more willing to help with future work. And when the whole project goes off without a hitch and you receive fantastic reviews (because why wouldn’t you?), tell them the good news and allow them to share in your success.

How I Boosted My Rankings Without Creating Content or Building Links

Posted by on Aug 28, 2018 in SEO Articles | Comments Off on How I Boosted My Rankings Without Creating Content or Building Links

How I Boosted My Rankings Without Creating Content or Building Links

I know what you are thinking: this isn’t possible.

Because the more content you have and the more links you have, the higher your rankings will be.

Although that is true, it doesn’t mean that content marketing and link building are the only ways to increase your rankings.

No matter what update Google rolls out, I’ve found that a few hacks consistently work to boost your rankings without creating more content or building more links.

So, are you ready to find out what they are?

What does Google want to rank at the top?

Before I get into the exact “hacks” and tactics that can boost your rankings, I want to first help you change the way you think about SEO.

Do you think Google really cares about on-page SEO and link building?

Sure, it matters to some extent, but that’s not what Google cares about the most.

Google wants to rank websites that people love. If they ranked websites that you hated, then you would slowly stop using Google.

And if people stopped using Google, then there would be fewer people to click on their ads, which means they would make less money.

That’s why Google cares about what you think, and why they ideally want to rank the websites that you love.

Now let’s dive into some hacks that will make people love your site, which will boost your rankings.

And don’t worry… I am not going to give you fluffy tactics; I have data to back up everything. 😉

Hack #1: Optimize your click-through rate

Let me ask you this:

If 10,000 people performed a Google search for the term “SEO” and clicked on the number 2 listing instead of the number 1 listing, what would that tell Google?

It would tell them that the number 2 listing is more relevant and that Google should move that listing to the number 1 spot.

Rand Fishkin ran an experiment where he told all of his Twitter followers to perform a Google search for the term “best grilled steak” and to click on the first listing, hit the back button, and then click on the 4th listing.

Within 70 minutes the 4th listing jumped into the top spot.

And that page even started to rank at the top of page 1 for the term “grilled steak”.

The ranking eventually slipped back down because people didn’t really feel that the listing was that great compared to some of the other listings.

Instead, it only climbed because Rand has a loyal following, and everyone helped trick Google into believing that it was more relevant (at least in the short term).

But this should give you a sense that Google cares what you think. So much so that they will adjust rankings in real time because they don’t want to show you pages that you feel are irrelevant (no matter how many backlinks the page has or how well its on-page code is optimized).

And Rand wasn’t the only person who tested this theory. It’s been done countless times, and each time it produced similar results.

You want people to click on your listing more than the other ones out there. It’s that simple.

If you can generate more clicks (in a legitimate way) than the listings above you, eventually you’ll notice your rankings climb without having to write more content or build more links.

So, how do you get more clicks?

Well, you have to adjust your title tag and meta description tag to be more appealing.

Anytime you perform a Google search, you see a list of results. And each result has a title, URL, and description:

The link part is the title (also known as the title tag), then there is the URL (which is green in color), and lastly, there is the description (black text… that is also known as the meta description).

If you are running a WordPress blog, you can easily modify your title tag and meta description using the Yoast SEO plugin.

There are a few ways you can generate more clicks on your listing over the competition:

Include keywords – people tend to click on listings that include the keyword or phrase they just searched for. Make sure you are using the right keywords within your title and description (I will get to this in a bit). This may sound basic, but when your web pages rank for thousands of terms, which one do you include in your 60-character title tag?
Evoke curiosity – titles that are super appealing tend to generate clicks. For example, if the keyword you were going after is “green tea,” a good title would be “11 Proven Benefits of Green Tea (#6 Will Shock You)”. I know it may seem a bit long, but it works because a lot of people will wonder what number 6 will be.
Copy magazines – anytime you see a magazine, you’ll notice that they have appealing titles and headlines on the cover. A lot of their titles contain “how to” or are list oriented. Look at magazines for inspiration.

Improving your search listings isn’t rocket science. Where most people mess up is that they pick the wrong keywords or they are terrible at writing copy. Remember, humans are reading your title tag and meta description tag, so they need to be appealing.

If you are struggling to write appealing copy, read my ultimate guide to copywriting.

Now let’s go over the exact steps you need to take to get more clicks.

The first step is to use Google Search Console.

Log into Google Search Console, then click on “Search Traffic” and then click on “Search Analytics”:

You’ll see a page that looks something like this:

Scroll back up to the top and click on the “pages” radio button and “CTR” checkbox:

You’ll see a list of results sorted by your most popular URLs and their respective click-through rate (also known as CTR):

Look for pages that have high traffic but a CTR of less than 5%.

Click on one of the listings with a CTR of less than 5% and then click on the “queries” radio button:

You’ll then want to look for the keywords with the highest number of “clicks” and the lowest CTR.

Those are the keywords you want to focus on in your title tag and meta description.

Remember, your title tag is limited to roughly 60 characters, which means you won’t be able to fit more than 2 or 3 keywords.

So, you want to pick the keywords that typically have the most clicks. They should also have a low CTR, because you selected pages with a CTR lower than 5%.

By adjusting your title tag and meta description to include the right keywords and by evoking curiosity, you’ll be able to increase your clicks. This will get you more search traffic in the short run and boost your rankings over time.
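If your site has a lot of pages, you can script this filter instead of eyeballing the report. Below is a minimal sketch; it assumes a hypothetical “pages.csv” exported from Search Console with page, clicks, impressions, and CTR columns (CTR formatted like “3.4%”), so adjust the column order to match your actual export.

```ts
// A minimal sketch: flag high-traffic pages whose CTR is under 5%.
// Assumes a hypothetical "pages.csv" with columns: page,clicks,impressions,ctr
import { readFileSync } from "node:fs";

const MIN_CLICKS = 1000; // "high traffic" threshold; tune for your site
const MAX_CTR = 0.05;    // the 5% cutoff described above

const [, ...rows] = readFileSync("pages.csv", "utf8").trim().split("\n");

for (const row of rows) {
  const [page, clicks, , ctr] = row.split(",");
  const ctrValue = parseFloat(ctr) / 100; // "3.4%" -> 0.034
  if (Number(clicks) >= MIN_CLICKS && ctrValue < MAX_CTR) {
    console.log(`${page}: ${clicks} clicks at ${ctr} -> rewrite title/description`);
  }
}
```

Every page this prints is a candidate for the title and description rewrites described above.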

Here are 3 tests that worked well for me when I adjusted my title tag:

I noticed I was getting a lot of traffic for the term “marketing digital” from countries outside of North America on one of my posts.

So, I adjusted my title tag from saying “digital marketing” to “marketing digital” which took my CTR from 3.36% to 4.45%. It also increased my search traffic by 1,289 visitors a month.

With the key phrase “social media marketing,” I adjusted my title tag based on an idea I got from a magazine. My CTR went from 2.38% to 2.84%. In total, that increased my traffic by 932 visitors a month.

With my social media marketing title tag, I added the phrase “step-by-step guide.”

This lets people know it is a how-to related post and it is action oriented. I also added the word “social media” a few times within the meta description.

And with the query “Google AdWords,” I noticed that Google announced they were switching their ad platform’s name from Google AdWords to Google Ads. I did the opposite of chasing the new name and focused more on the term “Google AdWords,” because very few people knew about the switch.

This helped drive an extra 1,355 visitors per month.

I’ve also had instances where the changes I made hurt my Google traffic.

So, whenever you adjust your title tag and meta description, mark the date down and look at the data in Google Search Console after 30 or so days to see whether the change hurt or helped.

If it hurt, revert the change and wait another 30 days; continuous testing can itself hurt your rankings. So when you have a losing variation, wait the full 30 days, no matter what, to let your rankings stabilize.

If the change helped boost your CTR and rankings, then you are off to a good start.
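To take some of the guesswork out of that 30-day check, you can compare average CTR in the windows before and after the change. A minimal sketch follows; the DailyStats shape and the example numbers are illustrative, not a Search Console export format, and dates are assumed to be ISO-formatted so they compare correctly as strings.

```ts
// Compare average CTR in the windows before and after a title-tag change.
// The DailyStats shape is illustrative, not a Search Console export format.
interface DailyStats {
  date: string; // ISO date, e.g. "2018-08-01"
  clicks: number;
  impressions: number;
}

function avgCtr(rows: DailyStats[]): number {
  const clicks = rows.reduce((sum, r) => sum + r.clicks, 0);
  const impressions = rows.reduce((sum, r) => sum + r.impressions, 0);
  return impressions === 0 ? 0 : clicks / impressions;
}

function evaluateChange(rows: DailyStats[], changeDate: string): string {
  const before = rows.filter((r) => r.date < changeDate);
  const after = rows.filter((r) => r.date >= changeDate);
  const delta = (avgCtr(after) - avgCtr(before)) * 100; // percentage points
  return delta >= 0
    ? `CTR up ${delta.toFixed(2)} points: keep the new title`
    : `CTR down ${Math.abs(delta).toFixed(2)} points: revert and wait 30 days`;
}

// Example: 3.0% CTR before the change, 4.5% after -> keep the new title.
console.log(
  evaluateChange(
    [
      { date: "2018-07-15", clicks: 120, impressions: 4000 },
      { date: "2018-08-20", clicks: 180, impressions: 4000 },    ],
    "2018-08-01"
  )
);
```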

Now that you’ve optimized your click-through rate, it’s time for you to optimize your user experience.

Hack #2: Show people what they want when they want it

If you go back to the experiment Rand Fishkin ran above, you’ll notice he told people to click the “back” button.

You don’t want people going to your site and clicking the back button… it will hurt your rankings.

People tend to click the back button because they don’t like what they see. If you can optimize your website for the optimal user experience, people will be less likely to click the back button.

I do this through 2 simple steps.

The first is to use Qualaroo to survey people. By asking a simple question (right when people are on your website) such as “How can I improve this page?”, you’ll get tons of ideas.

You can even use Qualaroo to find out why people are visiting your website, which again will help you understand the type of people visiting your site. This will allow you to tailor your experience to them.

I ran a Qualaroo survey on my main blog page. The biggest feedback I got from you was that it was hard to find the exact content you were looking for.

And I know why, too. It’s because I have marketing-related content on everything, from ecommerce to SEO to content marketing…

I decided to try something out: when you land on the blog page, you can select the type of content that piques your interest, and all of the content gets tailored to your needs.

I also ran a Crazy Egg test to ensure that you like the change I made. Based on the Crazy Egg heatmap below, you can see that it was successful.

The bounce rate on my blog page dropped by 21% as well. 🙂

I then looked at the Crazy Egg scrollmap to see which elements/areas of the page have the most attention. This helped me determine where I should place the content filtering option.

The Crazy Egg scrollmap of my blog page shows that the content filtering option generates 70% of the page’s attention.

Placing the filtering in a place where there is a lot of attention ensures that I am giving you what you need in a place that is easy to find.

I recommend that you look at the pages on your site with high bounce rates and consider running this same process to improve the user experience. When selecting pages, make sure you also pick ones that get decent traffic.

After you optimize your user experience, you want to focus on building a brand.

Hack #3: Build a brand

If you build a brand like Facebook or Amazon or any of the popular sites, you’ll rank higher.

Eric Schmidt, the ex-CEO of Google, once said:

Brands are the solution, not the problem. Brands are how you sort out the cesspool.

I ran an experiment that helped build up my brand, and my search traffic skyrocketed (unintentionally).

My traffic went from 240,839 unique visitors per month in June 2016:

To 454,382 unique visitors per month by August 2016:

Once I realized the power of branding, I started a podcast called Marketing School, and I started to publish videos on YouTube, Facebook, and LinkedIn multiple times per week.

This has led me to generate 40,412 brand queries per month:

I’m even getting 3,806 brand queries per month on YouTube alone:

But as you know, producing good content doesn’t guarantee that your brand will grow.

Even if you build tools and release them for free (like I did with Ubersuggest), it still won’t guarantee success.

But the one thing I have learned that works is the rule of 7.

When someone hears or sees your message 7 times, it is more likely to resonate with them; they build a connection and keep coming back.

So how do you get people to come back to your site?

The simplest solution that I’ve found to work is a free tool called Subscribers.

It leverages browser notifications to get people to “subscribe” to your website. It’s better than email because it is browser-based, which means people don’t have to give you their name or email address.

And then every time you want to get people to come back to your website, you simply send them a notification.

Look at how I’ve gotten over 42,316 people back to my site 174,281 times. That’s roughly 4 times per person.

Based on the rule of 7, I only have 3 more times to go. 😉

The way I use Subscribers is that I send out a notification blast every time I release a blog post.

The push looks something like this:

And instantly I’m able to get people back to my site:

When you start using Subscribers, you won’t see results right away. It takes time to build up your subscriber base, but it grows faster than you might expect.

Typically, you’ll generate a browser notification subscriber three times faster than an email subscriber.
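Subscribers handles all of the plumbing for you, but if you are curious what a notification blast involves under the hood, browser notifications ride on the open Web Push standard. Here is a rough server-side sketch using the open-source web-push npm package; this is not Subscribers’ actual code, and the VAPID keys, the saved subscription, and the post URL are all placeholders.

```ts
// A rough Web Push sketch with the open-source "web-push" package.
// Not Subscribers' actual code; keys and the subscription are placeholders.
import webpush from "web-push";

webpush.setVapidDetails(
  "mailto:you@example.com",      // contact address (placeholder)
  process.env.VAPID_PUBLIC_KEY!, // pair generated once via webpush.generateVAPIDKeys()
  process.env.VAPID_PRIVATE_KEY!
);

// In production you would store one subscription per visitor who opts in;
// a single saved subscription stands in for that whole list here.
const subscription = JSON.parse(process.env.SAVED_SUBSCRIPTION!);

// The "blast" you would send each time a new blog post goes live:
webpush
  .sendNotification(
    subscription,
    JSON.stringify({
      title: "New post is live",
      body: "Click to read it before everyone else does.",
      url: "https://example.com/latest-post/",
    })
  )
  .then(() => console.log("Notification blast sent"));
```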

Conclusion

If you only focus on things like on-page SEO, link building, or even blogging, you won’t dominate Google.

Why?

Because that is what everyone else focuses on. You have to do more if you want to beat the competition.

By doing what’s best for the user, you’ll have a better chance of beating everyone else.

Just look at me: I do what every other SEO does, plus more. Sometimes this causes my traffic to dip in the short run, but in the long run, it generally climbs.

From creating compelling copy so people want to click on your listing, to optimizing your user experience, to building a brand… you have to go beyond the SEO basics.

SEO has become extremely competitive. 5 years ago, it was much easier to rank at the top of Google.

If you use the 3 hacks above, here’s how long it will typically take to notice results.

Optimizing title tags – assuming you run successful tests, you can see small results in 30 to 60 days. Over time the results get even better.
Improving user experience – making your user experience better will instantly improve your metrics such as bounce rate, pageviews per visitor, time on site, and conversion rate. As for search rankings, it does help, but not instantly. Typically, it takes about 4 to 6 months to see results from this.
Brand building – sadly, it takes years. Sure, tools like Subscribers will instantly grow your traffic, but they won’t impact your search rankings right away. You have no choice but to build a brand.

So which one of these hacks are you going to test out first?

The post How I Boosted My Rankings Without Creating Content or Building Links appeared first on Neil Patel.

Yoast SEO 8.1: Gutenberg part 2, introducing the snippet preview

Posted by on Aug 28, 2018 in SEO Articles | Comments Off on Yoast SEO 8.1: Gutenberg part 2, introducing the snippet preview

Yoast SEO 8.1: Gutenberg part 2, introducing the snippet preview

Two weeks ago, we launched Yoast SEO 8.0. In it, we shipped the first part of our integration with Gutenberg: the sidebar. That release was the foundation on which we are building the next parts of our integration with the new WordPress editor. In Yoast SEO 8.1, we introduce part 2: a Gutenberg-proof snippet preview. Also, a much better experience in the content analysis thanks to webworkers!

Gutenberg, meet the Yoast SEO snippet preview

Yoast SEO 8.0, unfortunately, had to make do without a snippet preview inside Gutenberg. There were still some kinks to iron out before we could add that snippet preview to our WordPress plugin. The code for that new modal — the pop-up screen — had to be written from the ground up, exclusively for Gutenberg. That code has now been added to Gutenberg’s core so every WordPress developer can make use of the modal inside the new editor. How awesome is that!

Here’s what the snippet preview pop-up inside Gutenberg looks like:

You’ll see that it looks just like the regular Yoast SEO snippet preview. It has all the features you know and love: true-to-life rendering of your snippet on both mobile and desktop screens, an SEO title editor with snippet variables, a slug editor, and a meta description editor, also with snippet variables. To open the snippet preview, you simply click the Snippet Preview button in the Yoast SEO Gutenberg sidebar.

Another cool thing now available in Gutenberg is the Primary Category picker, which has been a staple of Yoast SEO for many years. It lets you pick and set the primary category for a post, which will then be selected automatically whenever you create a new post. We will port more features over to Gutenberg shortly.

What’s next

We, of course, have big plans for Gutenberg. There’s still a lot to be done, and not everything we’re dreaming up is possible right now. Step by step, we’re turning Yoast SEO and Gutenberg into a dream combination. We’re not just porting existing features over to Gutenberg, but actively exploring what we can do and what we need to get there. In some cases, that means we have to develop the support inside Gutenberg’s core ourselves; this way, loads of developers can benefit from the results as well.

Speeding up the content analysis with webworkers

Speed = user experience. To keep Yoast SEO performing great, we added a dedicated webworker to our content analysis. Webworkers let you run a script in the background without affecting the performance of the page. Because a webworker runs independently of the user interface, it can focus on one task and do it brilliantly. Webworkers are very powerful and help us keep Yoast SEO stable, responsive, and fast, even when analyzing pages with thousands of words of content. Try it!
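If you are curious what that pattern looks like, here is a generic webworker sketch using plain browser APIs (not Yoast’s actual analysis code): the heavy lifting runs off the main thread, so typing in the editor never stalls.

```ts
// A generic webworker sketch: heavy text analysis runs off the main thread.
// This is an illustration of the pattern, not Yoast's implementation.
const workerSource = `
  self.onmessage = (event) => {
    // Stand-in for a real content analysis: count words in the document.
    const wordCount = event.data.trim().split(/\\s+/).length;
    self.postMessage({ wordCount });
  };
`;

// Build the worker from an inline blob so the sketch is self-contained.
const worker = new Worker(
  URL.createObjectURL(new Blob([workerSource], { type: "application/javascript" }))
);

worker.onmessage = (event) => {
  console.log("Analysis finished without blocking the editor:", event.data);
};

// The UI thread hands the content over and immediately returns to the user.
worker.postMessage("Thousands of words of post content go here…");
```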

The update is available now

Yoast SEO 8.1 has a lot of improvements behind the scenes that should drastically improve how the plugin functions. We are dedicated to giving you the best possible user experience, while also improving our current features and laying the groundwork for new ones. And not to forget that new WordPress editor, right? Update and let us know what you think!

Read more: Why you should buy Yoast SEO Premium »

The post Yoast SEO 8.1: Gutenberg part 2, introducing the snippet preview appeared first on Yoast.

The evolution of search: succeeding in today’s digital ecosystem – part 2

Posted by on Aug 28, 2018 in SEO Articles | Comments Off on The evolution of search: succeeding in today’s digital ecosystem – part 2

In the first part of our discussion on the evolution of search, we looked at the change in customer behaviors, which has led to a struggle between search engines and apps to remain relevant.

We also started to dissect key parts of the new digital ecosystem, looking in detail at the most obvious manifestation of these indirect answers, the information that powers these, and the change in mindset required to capitalize on the opportunities direct answers present. In this second part, we will consider further the outputs of the fundamental changes to search—and what this means for SEO as a channel in the future.

Voice is important, but we’re looking at it the wrong way

It wouldn’t be right to consider the evolution of search and featured snippets without discussing voice search. Many are looking to this as the new frontier for search, doubling down on strategies to become the answer to questions that people ask. Voice search is undoubtedly taking off in a big way, with 2016 being a turning point in the growth of the channel, but there are two challenges “voice marketers” will face: firstly, there is still a stigma to using voice in public—consumers may use quick commands, but they are yet to embrace the full capabilities of smart assistants among other people.

Secondly, smart speakers are becoming a part of people’s homes in a big way, with an estimated 40% of UK homes due to have an Amazon Echo in 2018. Despite this, companies will struggle to convince their audiences to accept unsolicited branded messages. This has become even more of a problem in the wake of GDPR and claims of smart devices “listening in,” though I expect more tolerance to develop in the future.

Until that point, it doesn’t matter if you’re the answer; users won’t know who has delivered the results they are listening to.

A much bigger opportunity in voice, although falling a little outside of the search marketer’s remit, is “skills.” When the app store launched, many of the first apps were utilitarian or games; the idea of a “branded” app was yet to be developed. However, as smartphones became ubiquitous, the prevalence of apps increased. I believe the same will be true of “skills.” For now, many of these provide data that the assistants cannot store first-hand, such as bus times and weather information. Over time, however, these could provide a branded experience for more conventional voice queries. Already, skills allow brands to provide a personalized response across voice. Importantly, as skills must be linked, these are solicited; or, put simply, you can brand the answers you give to user questions in an agreed format. Right now, this is a powerful tool; in the future, this will be a game-changer.

For those still looking to own the answers, owning the data feeds is key. While you can optimize for this in the same way as featured snippets, it’s harder to convince voice speakers (whose sole result has to be infallible, or users will stop asking) that you are the one result to rule them all. This is why I believe Yext’s recent announcement that they will be pushing information directly to Alexa is as critical a change to search marketing as the launch of Penguin or Panda. For the first time, key data and knowledge feeds can be fed directly into these platforms, and brands can not only influence the information that Google, Amazon, Microsoft, and other platforms hold on them (which is currently the case with answer optimization), they can own the narrative entirely.

As search engines look to promote results directly in search (whatever the format), this is a giant step forward towards the digital ecosystem of the future and should not be underestimated.

Speed and mobile are intrinsically linked; new formats will enable this

We’re all bored of hearing the phrase “content is king”; in fact, the “is king” moniker has been done to death. As a result, “speed is king” probably does not carry the weight it needs to, and that is a shame, because it risks overlooking a crucial part of web marketing in 2018. From a pure SEO perspective, speed is now linked to improved visibility, in the same way that the interstitial ad penalty penalized sites for pop-ups.

However, if you’re blocking pop-ups or reducing your page load times for search traffic alone, you are firmly missing the point. This isn’t an “SEO thing.” This is a user experience essential, based on the changing demands of the digital-savvy customer in the modern age of technology; an audience that expects to quickly access the content they wish to furiously consume. Any delays or blockers in this process can be disastrous—not only to the brand, but to search engines as a whole.

Popular apps provide seamless, tailored experiences to their users; to stay as information leaders, this has to be replicated across search. A slow response, even if it’s not directly the fault of the provider, only serves to drive users away.

This is why Google is backing new formats; from accelerated mobile pages to progressive web apps and all device-focused changes (including in their index), the search giant is looking to improve the quality of the mobile web, a challenge it is uniquely well-positioned to undertake. As SEOs we should be embracing this—it’s better for our users. Yet we are limited by questions around tracking and data integrity (which Google is looking to change) and by the main engines’ ability to crawl and index JavaScript content, a programming language that will be key to bringing about the change that Google, Bing, and other providers need to stay relevant to their users.

For now, the biggest threats are mobile and apps; as other emerging technologies become more widely adopted, particularly in the immersive experience space, both the web and search engines will need to catch up to survive. And I believe it is not only the responsibility of SEOs to drive these changes forward; doing so is both absolutely in our interest and intrinsic to the continuation of investment in our channel.

The future is bright, but SEO will never be the same

With the rise of apps and Google looking to push answers directly to users, reducing the importance of the website in the digital ecosystem, you could argue that the importance of SEO activity is dwindling. That would be a myopic view of the future; while the basis of our activity roadmap may change, there will still be a requirement for optimization. Just as the major algorithm launches earlier in the decade fundamentally changed the way we operate and the skills required to succeed in the channel, so too will the behavioral changes we are currently experiencing. As we have always done, we will adapt.

In his 2016 Brighton SEO talk, Jono Anderson argued that the digital marketer of the future will not need to learn new skill-sets but combine existing ones. For search marketers, this means focusing on specific areas of knowledge where we can be the most effective, instead of trying to know it all as we currently do. Most digital agencies have already separated content and SEO teams into two different, yet complementary work streams. Structuring technical and local experts into teams of their own is becoming more popular, and doing so allows the marketers within those teams to shape their abilities around the requirements and objectives of their specialism.

Looking ahead, there will always be a place for search engines in the digital ecosystem, although their importance to the whole is yet to be decided. As such, there will be a continued opportunity (and need) for search marketing. The SEO of the future may be a very different person than today’s, and the focus of digital agencies will be split between building brands, building web experiences, and structuring information to be easily understood by data feeds. But until agencies truly leave the ranking factors of the past behind and fully support this new digital world, powered by technology, convenience, and customers, the channel will be at perpetual risk of becoming irrelevant to our audiences.

Writing great social media content for your blog

Posted by on Aug 28, 2018 in SEO Articles | Comments Off on Writing great social media content for your blog

Writing great social media content for your blog

I’ve always felt lucky blogging for Yoast.com. As I wrote before, I have an entire blog team that makes sure my posts get scheduled, are free of grammar and spelling errors, and get published on social media. So I ‘only’ have to come up with an idea, which the team often helps me with, and type the post. I decided that if I ever were to outsource things on my own blog, it would be things like promotion and social media.

My struggle with social media

And then the inevitable happened. After I finished my previous post, I got a message: “Caroline, from now on, please write your own introduction for Facebook, Twitter, and the newsletter. Here’s some information for you. If you have any questions, let us know!” Hold on! Yes, I have questions! Starting with: “How do I do this?” and: “Do you have any idea how difficult it is to write short messages? There’s a reason I’m not active on Twitter!” And, so began my struggle, and search, for the ultimate social media messages.

Because truthfully, I’d rather type a 2,000-word essay than one sentence for Facebook. When you’re reading this, I’ve already grabbed your attention. You’ve already made it down to this point in my post, which means that you want to read my message. On social media, I can’t spend over a hundred words to make my point. If I do, you might not click, you might scroll past my message, and you’ll never see my post at all.

And that’s how I started my two-day research. Two days? Yes. I, of course, started rather late with this blog post and had almost no time to conduct proper research. So, all the information in this post is based on my common sense – and I’ll teach you how to use your common sense too! Oh, how amazing my job is. Truly. Well, apart from having to write my own social media messages now.

To click or not to click

When do you click on a Facebook message? When do you hit the like button? When do you leave a reply? And when do you make the effort to go to someone’s profile and visit their domain through Instagram when there’s a ‘link in bio’ message underneath a photo? Those questions were the most important for me over the last few days, to figure out what the perfect message entails. To find the answer to these questions, you need to know who your audience is.

For my blog, that’s a rather easy answer: the target audience for my blog is me! And people like me, of course. I started my blog because I love writing, and I’m right in the middle of my audience: young mothers (and fathers, of course) who are struggling with parenthood and want reassurance that others are struggling too. I want people to laugh at my stories, but also to take their struggles, and life, a little less seriously, in order to enjoy life more.

Experimenting on different platforms

While people who visit my blog always tell me I have a great sense of humor (except for my husband, who still claims I have no humor at all), my Facebook page didn’t reflect my blog at all, and come to think of it, I didn’t even like Facebook.

I started experimenting on Instagram: my photos were more blunt, I used a lot of hashtags (thirty hashtags seems to be the maximum) and I treated Instagram as if I was talking to my best friend. Immediately, my engagement went up. People responded to my photos with more than just a heart, they actually left messages! I started to get to know my audience more and more, and then a few days ago I decided I’d use the same strategy on Facebook.

I took a notebook and wrote down when I was interested in a Facebook post from another company, and when I scrolled past. And, although this is personal (and not perfect) research, this works for me, since I am a reflection of my own audience. I made notes on the posts I clicked on: what was the message they wrote? What was the title of the post? Did the image appeal to me? And when did I decide not to click on a post?

I found out that I click the link when all three aspects of a post (text, title, and photo) appeal to me. There were messages I saw multiple times but didn’t click, because the Facebook image wasn’t appealing enough, or the leading text was too vague or didn’t catch my attention.

How to find your voice on social media

It’s important your social media reflects your website. If you write for solo travelers who are 20 years old, it’d be strange if your social media posts are more appealing to people who’d rather stay in and haven’t taken a vacation in the last 20 years. Just like you once found your voice for your blog, you need to find your voice on social media too. And you’ll have to experiment before you find it. Here’s how to experiment:

Realize that your social media are part of your brand

Facebook, Instagram, and other social media are extensions of your blog. Try to find the reason why you follow someone on Instagram, hit the like button on Facebook or retweet a message on Twitter. It’s probably because you feel connected to someone or to the brand. Those social media accounts should reflect the blog, in this case.

Write different introductions

By writing and rewriting your Facebook messages a few times, you will eventually find the voice that fits your brand. You can’t be as elaborate on Facebook or Instagram as you are on your blog. You need to catch people’s attention and get them to click that link to your website.

With Facebook, you can easily re-post a post that’s a couple of months old. Check which posts performed poorly: you can look that up on your Facebook page under ‘Statistics’. Then look at the accompanying messages you wrote, try to rewrite them, and see if you can gain more clicks.

It’s all about strategy

Just as you need a plan for your blog, you also need a social media plan and strategy. If you post on Facebook only once a week, you probably won’t reach a lot of people. However, if you post once or twice a day, you’ll see your reach go up. Those posts don’t always have to be a link to your blog, especially when you only blog every other day or once a week. Share images, ask questions, share links to other blogs in your niche, or share quotes. Look at your competition and try to find a new angle to implement on your social media profiles.

Read more: How to use social media »

And now it’s time for me to write a nice introduction for social media so you’ll actually end up clicking and reading this message. Wish me luck. Oh and please drop your tips on me as well! You have no idea how much I learn from the comments you leave on my blog posts!

Keep reading: Social media strategy: where to begin? »

The post Writing great social media content for your blog appeared first on Yoast.