SEO Articles

Site Migrations & Creating New URL Structures

Migrating or redesigning a site is a chance for new beginnings. New pages! New functionality! New systems!

But with new things come new responsibilities. One of the first items to consider is your new URL structure. Taking the time to discover the best URL structure for your site (and for SEO) will set you up for future success.

As an SEO overseeing a migration, your role will shift from an in-the-weeds analyst to a consultant and educator. You'll be asked about best practices and tasked with guiding teams through this unfamiliar world full of potential and new HTML tags. You'll work with IT teams and developers you previously never knew existed.

Suddenly, you’ll be fielding questions about:

URL Structures
Which New Pages to Create/Delete
How to Organize the Site
Analytics

And the list goes on. If you've done your job up to this point, you'll have advocates on other teams asking, "But what about the SEO implications?" They may not understand exactly how SEO works, but you've scared them with enough mentions of algorithm updates and ranking drops that any project – no matter how small – must be signed off by the SEO team.

Congrats! You’ve made yourself invaluable and involved in every decision. Before you think this is a bad thing, you’re actually #blessed.

While these pre-migration discussions may dominate your schedule, it's better to be involved upfront and tell the UI team that Google views a blue button and an orange button equally than to be brought in at the 11th hour and learn *gasp* that no one's thought of canonical tags.

URL Structures

If your site is upgrading its platform, 99% of the time that means your URLs will change, too. Hopefully, for the better. Gone are the days of CatID=12345 – hello, keyword-rich URLs!

This is your chance to establish ground rules. You’ll be able to create folders and character limits that will dictate URL structures for the foreseeable future.

Before You Dive In…

Google’s John Mueller has said to not change URLs just for SEO purposes. If you have the option to keep the same structures, don’t. touch. anything.

While it's tempting, those URLs have been accruing authority for years and may be linked to from other sites, and redirects introduce risk and volatility. Plus, they're one extra thing you'll have to manage.

Sometimes, though, you might not have a choice.

If you’re doing a full redesign and the new back-end systems necessitate a URL change, do it wisely. If the company is rebranding and the new brand comes with a fresh URL, make it count.

Strategically approach this time and infuse SEO best practices into your shiny new URLs.

Use Normal Naming Conventions

While keywords in URLs inform visitors what a page is about, they don’t provide the same ranking boost they once did.

In a 2016 Google Webmaster Hangout, Mueller shared that keywords in URLs are “a very small ranking factor. It’s not something I’d really try to force.”

This means you should encourage the teams naming URLs to choose something intuitive, but don't stress about finding the exact right phrase. If you're torn between /car-repair/ and /auto-repair/ – use either! Google gets it.

Shorter URLs are Better Than Longer Ones

To paraphrase Occam's Razor, "the simplest solution is usually the best." The same is true when creating URLs.

If you're deciding whether /new-homes-for-sale/ or /new-homes/ is best, save yourself nine characters and opt for /new-homes/.

Other pages on your site will provide context, and you'll save characters by removing implied phrases. Shorter URLs are also easier to display on smaller screens for mobile searches.

What About Mobile?

After reading my first draft, my colleague asked that question. It’s what we all should be asking these days.

What about mobile URLs?

Responsive design is Google's recommended design pattern. Having your mobile and desktop pages responsively load on one www. domain is preferred, but you can handle separate m. and www. URLs by setting appropriate canonical and rel=alternate tags between the two.

If that’s how your site is configured, see Google’s guidelines on how to annotate.
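For illustration, the usual bidirectional annotation looks something like the snippet below. The URLs are hypothetical placeholders; check Google's current guidelines for the exact pattern they recommend.

```html
<!-- On the desktop page (https://www.example.com/page) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page (https://m.example.com/page) -->
<link rel="canonical" href="https://www.example.com/page">
```

The alternate tag tells Google where the mobile twin lives; the canonical tag on the mobile page consolidates signals back to the desktop URL.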

URL structure is less important to mobile visitors from a visual standpoint. It's an element of your site that few phone visitors may even see. On my 5.7-inch XL phone, I see 28 characters in my Chrome browser.

When you think about it, that’s really small. It’s the equivalent of “https://www.examplesite.com/”. Everything beyond that is hidden.

With the widespread use of schema.org markup and Google getting better at understanding a site's folder structure, full mobile URLs also rarely show up in SERPs. All a user will see on their small screen is a breadcrumb-style list of folders.

If you have separate m. and www. URLs, apply the same best practices to both. Keep the URL structure the same between your domains to help with cross-device consistency. Since site users will rarely see your URLs and Google rarely surfaces them, don't overthink mobile URL structure.

Subfolders Should be Used in Moderation

The number of subfolders to include can vary greatly; there is no one-size-fits-all recommendation. That’s because considering URL folders and the right number to expose gets tricky.

Our recommendation is to consider your specific implementation and the size of your site.

Let’s run through options.

For small sites, displaying 2-3 folders in a URL can provide users with additional context before even viewing the page. Your site is small and it’ll be easy to manage.

Perhaps you’re a restaurant with multiple locations and unique menus. You’d want to expose each of those locations within the URL to help customers know which location they’re viewing: /east-end/menu and /north-shore/menu. See how those URLs are better than /menu-1 or /menu-2?

Think of URLs as a way to tip off visitors about what they'll see before they get to your site.

For large sites – especially eCommerce ones – you’ll have more things to consider. You may have to decide whether to display parent categories (and how many) or perhaps you’ll want to name URLs according to their page template.

If you’re leaning towards a multi-folder URL path, you’ll want to consider what’s intuitive without being overkill. It’s very much a “trust-your-gut” situation.

Let's imagine you're a jeweler who sells gold solitaire engagement rings. You have a few options for the parent-child relationships of that landing page:

/rings/engagement/solitaire/gold (4 folders)
/solitaire/gold (2 folders)
/rings/gold-solitaire-rings (2 folders)
/gold-solitaire-rings (1 folder)

All are viable options, but which one provides context and is most intuitive? That’s up to you.

You also may want to use a naming convention based on page templates.

If, for example, you have a page template that shows products, you may want to name all those pages… wait for it… /products/. This would apply to your necklaces, rings and earrings pages.

/products/necklaces
/products/rings
/products/earrings

So which option is best? I wish we could give you an answer, but this one will depend on your site. To help you decide, sketch out one section of your navigation and what the URLs might look like. Does it feel like overkill if you have 6 folders when 2 will do? Are your users going to get lost by seeing a page that’s 5 folders deep? Does each folder provide value to your customers by being exposed?

It’s Okay to Be A Little Selfish

Once you’ve considered what’s best for your visitors, it’s time to be a bit self-serving. Is the solution you’ve developed also best for YOU? How will your internal teams use those URLs in their reporting, monitoring and general navigation around the site?

If you’re an internal team or an agency working with one client, you will see these URLs every. single. day. of. your. life.

Would it help to filter your data to product detail pages if every page lived within a /p/ folder? (Probably, yes.)

Do you want to view only earring pages, so a parent directory of /earrings/ would help you easily sort the data? (Probably, yes here, too.) Teams with a sensible subdirectory structure can easily roll up behavior metrics by subdirectory in Google Analytics using an underused feature called the Content Drilldown report.
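To see why this matters, here's a rough Python sketch of the kind of roll-up a clean folder structure makes trivial. The page paths and pageview counts are made up for illustration:

```python
from collections import defaultdict

# Hypothetical (page path, pageviews) export from your analytics tool
pages = [
    ("/p/gold-solitaire-ring", 1200),
    ("/p/silver-hoop-earrings", 800),
    ("/stores/east-end", 300),
    ("/stores/north-shore", 150),
]

def rollup_by_folder(rows):
    """Sum pageviews by the first URL folder (e.g. /p/, /stores/)."""
    totals = defaultdict(int)
    for path, views in rows:
        folder = "/" + path.strip("/").split("/")[0] + "/"
        totals[folder] += views
    return dict(totals)

print(rollup_by_folder(pages))  # {'/p/': 2000, '/stores/': 450}
```

With every product page under /p/, one line of grouping answers "how are product pages doing?" – no regex gymnastics required.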

When sharing the Store Locator URL with your Social team, is it easier for you to share /stores/store-locations/view-all or more simply, /stores/? The more characters in a URL, the easier it is to mess it up.

Create New URLs with Purpose

A migration or site redesign is your chance to reset naming conventions for URLs. While keywords in URLs have little impact on how a page ranks, creating simple, intuitive URLs is a win-win-win all around.

They’ll:

Help customers orient themselves on your site
Provide context for users when clicked on from an external link
Make life infinitely easier for reporting or daily SEO tasks

If your URLs need to change, embrace the opportunity and create a foundation that will benefit IT, Marketing and SEO teams for years to come.

The post Site Migrations & Creating New URL Structures appeared first on LunaMetrics.

The difference between on-site blog content and off-site content

Google recently updated its search quality rating guidelines, which has had a profound impact on the way content is created. Publishing a revised 164-page document, the leading search engine is now paying greater attention to what users are searching for and what information they end up reading.

The tech giant has not been afraid to say that it has a focus on enhancing the user experience across the platform, and the changes that have been introduced for content creators reinforce this statement for marketers around the world.

While well-crafted onsite content can help strengthen your brand’s message and highlight your industry expertise, you’ll also need to produce creative offsite content that will help your business secure the best online coverage across a range of publications to increase rankings while amplifying your brand.

On-site content

There are multiple elements that cover on-site content, and when done correctly, effective on-site content can help increase your website’s search rankings. If you’re looking to become the go-to brand/service for your prospective customers, it’s crucial that you appear at the top of the results page.

Ultimately, blog content on your business website is there to support the user’s journey while providing them with the most insightful information that they need during their visit. This could also support them when making a purchase, as they see you as a more trustworthy figure. There are a few techniques you can use to make sure that your on-site blog content performs exceptionally well.

The first step to creating blog content is to understand who is reading it — usually this will be your main demographic who already have an interest in your products or services. Although you’ve positioned yourself as an authoritative figure, you need to speak to your website visitors as if they’re on your level for both acquisition and retention purposes.

You also want to avoid industry jargon, as it can be an instant turn-off for a reader. It's important to be transparent with your audience and give them the information they need concisely, without diluting its substance.

You also need to use your blog content to tell your audience that you're better than your competitors. This can be achieved by showing off your USPs – whether that's next-day delivery or a lengthy warranty on products. If you're creating an article on your site that delivers useful information to the reader, they won't mind you being slightly advertorial, as this can also benefit them.

Internal links are a must in your blog post, but only if they are relevant. If you’re discussing a certain product or service that you offer, you should be linking to the relevant page to help improve the overall page authority.

It’s essential that you end your blog post with a call to action, because if a reader has made it all the way through your article, they’re already invested in your business and are more likely to perform an action.

Off-site content

Creating off-site content is completely different from making blog posts for your business website. This time, you’re not trying to appeal to your customers but to journalists and major publications that will drive authority to your website while having the ability to increase brand visibility.

It requires a full team of innovative and creative people to come up with outreach ideas that can support an SEO campaign. You should have an aim to create pieces of content that can be outreached to different publications that cover various niches. For example, an article that discusses how technology has improved health and safety in the workplace would appeal to technology, business and HR websites, all of which can improve your link building strategy for your online marketing campaigns.

This also means that you must carry out extensive research into what is relevant in the news. From an outreach perspective, this can allow you to see what type of content journalists are looking for and what is currently working well in terms of online coverage.

As well as this, you should also be looking at creating content around national or international events or celebrations — as editors are more likely to pick up this type of content because it will appeal to a wide audience and generate an overall buzz. Recently, we saw this with the World Cup and will soon see the same with the upcoming Christmas period.

Publications and journalists will not take content pieces that are too advertorial, as they want to provide readers with content that is informative and unbiased — but that is not to say they won’t credit you with either a brand mention or a link to one of your target pages.

Although content creation for both on-site and off-site may look similar, they can be very different in tone, format and objective.

Ranking the 6 Most Accurate Keyword Research Tools

Posted by Jeff_Baker

In January of 2018, Brafton began a massive organic keyword targeting campaign, amounting to over 90,000 words of blog content published.

Did it work?

Well, yeah. We doubled the number of total keywords we rank for in less than six months. By using our advanced keyword research and topic writing process published earlier this year we also increased our organic traffic by 45% and the number of keywords ranking in the top ten results by 130%.

But we got a whole lot more than just traffic.

From planning to execution and performance tracking, we meticulously logged every aspect of the project. I’m talking blog word count, MarketMuse performance scores, on-page SEO scores, days indexed on Google. You name it, we recorded it.

As a byproduct of this nerdery, we were able to draw juicy correlations between our target keyword rankings and variables that can affect and predict those rankings. But specifically for this piece…

How well keyword research tools can predict where you will rank.

A little background

We created a list of keywords we wanted to target in blogs based on optimal combinations of search volume, organic keyword difficulty scores, SERP crowding, and searcher intent.

We then wrote a blog post targeting each individual keyword. We intended for each new piece of blog content to rank for the target keyword on its own.

With our keyword list in hand, my colleague and I manually created content briefs explaining how we would like each blog post written to maximize the likelihood of ranking for the target keyword. Here’s an example of a typical brief we would give to a writer:

[Image: an example of a content brief Brafton delivers to writers]

Between mid-January and late May, we ended up writing 55 blog posts, each targeting one of 55 unique keywords. 50 of those blog posts ended up ranking in the top 100 of Google results.

We then paused and took a snapshot of each URL’s Google ranking position for its target keyword and its corresponding organic difficulty scores from Moz, SEMrush, Ahrefs, SpyFu, and KW Finder. We also took the PPC competition scores from the Keyword Planner Tool.

Our intention was to draw statistical correlations between our keyword rankings and each tool's organic difficulty score. With this data, we were able to report on how accurately each tool predicted where we would rank.

This study is uniquely scientific, in that each blog had one specific keyword target. We optimized the blog content specifically for that keyword. Therefore every post was created in a similar fashion.

Do keyword research tools actually work?

We use them every day, on faith. But has anyone ever actually asked, or better yet, measured how well keyword research tools report on the organic difficulty of a given keyword?

Today, we are doing just that. So let’s cut through the chit-chat and get to the results…

While Moz comes out as the top-performing keyword research tool, note that any keyword research tool with organic difficulty functionality will give you an advantage over flipping a coin (or using the Google Keyword Planner Tool).

As you will see in the following paragraphs, we have run each tool through a battery of statistical tests to ensure that we painted a fair and accurate representation of its performance. I’ll even provide the raw data for you to inspect for yourself.

Let’s dig in!

The Pearson Correlation Coefficient

Yes, statistics! For those of you currently feeling panicked and lobbing obscenities at your screen, don’t worry — we’re going to walk through this together.

In order to understand the relationship between two variables, our first step is to create a scatter plot chart.

Below is the scatter plot for our 50 keyword rankings compared to their corresponding Moz organic difficulty scores.

We start with a visual inspection of the data to determine if there is a linear relationship between the two variables. Ideally for each tool, you would expect to see the X variable (keyword ranking) increase proportionately with the Y variable (organic difficulty). Put simply, if the tool is working, the higher the keyword difficulty, the less likely you will rank in a top position, and vice-versa.

This chart is all fine and dandy, however, it’s not very scientific. This is where the Pearson Correlation Coefficient (PCC) comes into play.

Phew. Still with me?

So each of these scatter plots will have a corresponding PCC score that will tell us how well each tool predicted where we would rank, based on its keyword difficulty score.

We will use the following table from statisticshowto.com to interpret the PCC score for each tool:

Coefficient Correlation R Score: Key

.70 or higher: Very strong positive relationship
.40 to +.69: Strong positive relationship
.30 to +.39: Moderate positive relationship
.20 to +.29: Weak positive relationship
.01 to +.19: No or negligible relationship
0: No relationship [zero correlation]
-.01 to -.19: No or negligible relationship
-.20 to -.29: Weak negative relationship
-.30 to -.39: Moderate negative relationship
-.40 to -.69: Strong negative relationship
-.70 or higher: Very strong negative relationship

In order to visually understand what some of these relationships would look like on a scatter plot, check out these sample charts from Laerd Statistics.

And here are some examples of charts with their correlating PCC scores (r):

The closer the numbers cluster towards the regression line in either a positive or negative slope, the stronger the relationship.
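The PCC itself is simple to compute. Here's a minimal pure-Python version, run on illustrative numbers (not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative only: difficulty scores vs. achieved ranking positions
difficulty = [20, 35, 40, 55, 70]
ranking    = [3, 12, 18, 40, 65]
print(round(pearson_r(difficulty, ranking), 3))  # strong positive (r ≈ 0.98)
```

In practice you'd pull the same result from a stats library, but the formula above is all the "magic" there is.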

That was the tough part – you still with me? Great, now let’s look at each tool’s results.

Test 1: The Pearson Correlation Coefficient

Now that we’ve all had our statistics refresher course, we will take a look at the results, in order of performance. We will evaluate each tool’s PCC score, the statistical significance of the data (P-val), the strength of the relationship, and the percentage of keywords the tool was able to find and report keyword difficulty values for.

In order of performance:

#1: Moz

Revisiting Moz’s scatter plot, we observe a tight grouping of results relative to the regression line with few moderate outliers.

Moz Organic Difficulty Predictability
PCC: 0.412
P-val: .003 (P<0.05)
Relationship: Strong
% Keywords Matched: 100.00%

Moz came in first with the highest PCC of .412. As an added bonus, Moz grabs data on keyword difficulty in real time, rather than from a fixed database. This means that you can get any keyword difficulty score for any keyword.

In other words, Moz was able to generate keyword difficulty scores for 100% of the 50 keywords studied.

#2: SpyFu

Visually, SpyFu shows a fairly tight clustering amongst low difficulty keywords, and a couple moderate outliers amongst the higher difficulty keywords.

SpyFu Organic Difficulty Predictability
PCC: 0.405
P-val: .01 (P<0.05)
Relationship: Strong
% Keywords Matched: 80.00%

SpyFu came in right under Moz with a 1.7% weaker PCC (.405). However, the tool ran into the largest issue with keyword matching: only 40 of the 50 keywords produced keyword difficulty scores.

#3: SEMrush

SEMrush would certainly benefit from a couple mulligans (a second chance to perform an action). The Correlation Coefficient is very sensitive to outliers, which pushed SEMrush’s score down to third (.364).

SEMrush Organic Difficulty Predictability
PCC: 0.364
P-val: .01 (P<0.05)
Relationship: Moderate
% Keywords Matched: 92.00%

Further complicating the research process, only 46 of 50 keywords had keyword difficulty scores associated with them, and many of those had to be found individually through SEMrush's "phrase match" feature rather than through the difficulty tool. Digging around for the data made the process more laborious.

#4: KW Finder

KW Finder definitely could have benefitted from more than a few mulligans, with numerous strong outliers, coming in right behind SEMrush with a score of .360.

KW Finder Organic Difficulty Predictability
PCC: 0.360
P-val: .01 (P<0.05)
Relationship: Moderate
% Keywords Matched: 100.00%

Fortunately, the KW Finder tool had a 100% match rate without any trouble digging around for the data.

#5: Ahrefs

Ahrefs comes in fifth by a large margin at .316, barely clearing the "moderate relationship" threshold.

Ahrefs Organic Difficulty Predictability
PCC: 0.316
P-val: .03 (P<0.05)
Relationship: Moderate
% Keywords Matched: 100%

On a positive note, the tool seems to be very reliable with low difficulty scores (notice the tight clustering for low difficulty scores), and matched all 50 keywords.

#6: Google Keyword Planner Tool

Before you ask, yes, SEO companies still use the paid competition figures from Google’s Keyword Planner Tool (and other tools) to assess organic ranking potential. As you can see from the scatter plot, there is in fact no linear relationship between the two variables.

Google Keyword Planner Tool Organic Difficulty Predictability
PCC: 0.045
P-val: Statistically insignificant/no linear relationship
Relationship: Negligible/None
% Keywords Matched: 88.00%

SEO agencies still using KPT for organic research (you know who you are!) — let this serve as a warning: You need to evolve.

Test 1 summary

For scoring, we will use a ten-point scale and score every tool relative to the highest-scoring competitor. For example, if the second highest score is 98% of the highest score, the tool will receive a 9.8. As a reminder, here are the results from the PCC test:

And the resulting scores are as follows:

Tool: PCC Test score
Moz: 10
SpyFu: 9.8
SEMrush: 8.8
KW Finder: 8.7
Ahrefs: 7.7
KPT: 1.1
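The relative scoring rule can be sketched in a few lines of Python, using the PCC values reported above (an illustration, not the authors' actual code):

```python
def relative_scores(pcc_by_tool):
    """Scale each tool's PCC to a 10-point scale relative to the best PCC."""
    best = max(pcc_by_tool.values())
    return {tool: round(pcc / best * 10, 1) for tool, pcc in pcc_by_tool.items()}

# PCC values from the Test 1 results
pccs = {"Moz": 0.412, "SpyFu": 0.405, "SEMrush": 0.364,
        "KW Finder": 0.360, "Ahrefs": 0.316, "KPT": 0.045}
print(relative_scores(pccs))
# Moz 10.0, SpyFu 9.8, SEMrush 8.8, KW Finder 8.7, Ahrefs 7.7, KPT 1.1
```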

Moz takes the top position for the first test, followed closely by SpyFu (with an 80% match rate caveat).

Test 2: Adjusted Pearson Correlation Coefficient

Let’s call this the “Mulligan Round.” In this round, assuming sometimes things just go haywire and a tool just flat-out misses, we will remove the three most egregious outliers to each tool’s score.

Here are the adjusted results for the handicap round:

Adjusted Scores (3 outliers removed): PCC | Difference (+/-)
SpyFu: 0.527 | 0.122
SEMrush: 0.515 | 0.150
Moz: 0.514 | 0.101
Ahrefs: 0.478 | 0.162
KWFinder: 0.470 | 0.110
Keyword Planner Tool: 0.189 | 0.144

As noted in the original PCC test, some of these tools really took a big hit with major outliers. Specifically, Ahrefs and SEMrush benefitted the most from their outliers being removed, gaining .162 and .150 respectively to their scores, while Moz benefited the least from the adjustments.

For those of you crying out, “But this is real life, you don’t get mulligans with SEO!”, never fear, we will make adjustments for reliability at the end.

Here are the updated scores at the end of round two:

Tool: PCC Test | Adjusted PCC | Total
SpyFu: 9.8 | 10 | 19.8
Moz: 10 | 9.7 | 19.7
SEMrush: 8.8 | 9.8 | 18.6
KW Finder: 8.7 | 8.9 | 17.6
Ahrefs: 7.7 | 9.1 | 16.8
KPT: 1.1 | 3.6 | 4.7

SpyFu takes the lead! Now let’s jump into the final round of statistical tests.

Test 3: Resampling

Being that there has never been a study performed on keyword research tools at this scale, we wanted to ensure that we explored multiple ways of looking at the data.

Big thanks to Russ Jones, who put together an entirely different model that answers the question: “What is the likelihood that the keyword difficulty of two randomly selected keywords will correctly predict the relative position of rankings?”

He randomly selected 2 keywords from the list and their associated difficulty scores.

Let’s assume one tool says that the difficulties are 30 and 60, respectively. What is the likelihood that the article written for a score of 30 ranks higher than the article written on 60? Then, he performed the same test 1,000 times.
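That procedure can be sketched roughly like this, using a made-up difficulty/ranking table (not the study's data):

```python
import random

# Hypothetical (difficulty score, achieved ranking) pairs for one tool
data = [(30, 5), (45, 12), (60, 40), (25, 3), (70, 55), (50, 20)]

def resample_accuracy(rows, trials=1000, seed=42):
    """Share of random keyword pairs where the lower difficulty score
    correctly predicted the better (lower) ranking position."""
    rng = random.Random(seed)
    correct = total = 0
    for _ in range(trials):
        (d1, r1), (d2, r2) = rng.sample(rows, 2)
        if d1 == d2 or r1 == r2:
            continue  # skip ties, as in the study
        total += 1
        if (d1 < d2) == (r1 < r2):
            correct += 1
    return correct / total

# The sample above is perfectly monotonic, so every pair is predicted correctly
print(resample_accuracy(data))  # 1.0
```

Real tools land well below 1.0, of course; the interesting question is how far above the 0.5 coin-flip baseline each one sits.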

He also threw out examples where the two randomly selected keywords shared the same rankings, or data points were missing. Here was the outcome:

Resampling: % Guessed correctly
Moz: 62.2%
Ahrefs: 61.2%
SEMrush: 60.3%
Keyword Finder: 58.9%
SpyFu: 54.3%
KPT: 45.9%

As you can see, this test was particularly critical of each of the tools. As we are starting to see, no one tool is a silver bullet, so it is our job to see how much each tool helps us make more educated decisions than guessing.

Most tools stayed pretty consistent with their levels of performance from the previous tests, except SpyFu, which struggled mightily with this test.

In order to score this test, we need to use 50% as the baseline (equivalent of a coin flip, or zero points), and scale each tool relative to how much better it performed over a coin flip, with the top scorer receiving ten points.

For example, Ahrefs scored 11.2% better than flipping a coin, which is 8.2% less than Moz's 12.2% edge over a coin flip, giving Ahrefs a score of 9.2.
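That scaling works out like this, using the accuracy figures from the resampling table (an illustrative sketch, not the authors' code):

```python
def coin_flip_scores(accuracy_by_tool, baseline=50.0):
    """Score each tool by its edge over a coin flip, scaled so the
    biggest edge earns 10 points."""
    edges = {tool: acc - baseline for tool, acc in accuracy_by_tool.items()}
    best = max(edges.values())
    return {tool: round(edge / best * 10, 1) for tool, edge in edges.items()}

# % guessed correctly from the resampling test
accuracy = {"Moz": 62.2, "Ahrefs": 61.2, "SEMrush": 60.3,
            "KW Finder": 58.9, "SpyFu": 54.3}
print(coin_flip_scores(accuracy))
# Moz 10.0, Ahrefs 9.2, SEMrush 8.4, KW Finder 7.3, SpyFu 3.5
```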

The updated scores are as follows:

Tool: PCC Test | Adjusted PCC | Resampling | Total
Moz: 10 | 9.7 | 10 | 29.7
SEMrush: 8.8 | 9.8 | 8.4 | 27
Ahrefs: 7.7 | 9.1 | 9.2 | 26
KW Finder: 8.7 | 8.9 | 7.3 | 24.9
SpyFu: 9.8 | 10 | 3.5 | 23.3
KPT: 1.1 | 3.6 | -.4 | .7

So after the last statistical accuracy test, we have Moz consistently performing alone in the top tier. SEMrush, Ahrefs, and KW Finder all turn in respectable scores in the second tier, followed by the unique case of SpyFu, which performed outstandingly in the first two tests (albeit only returning results for 80% of the tested keywords), then fell flat on the final test.

Finally, we need to make some usability adjustments.

Usability Adjustment 1: Keyword Matching

A keyword research tool doesn’t do you much good if it can’t provide results for the keywords you are researching. Plain and simple, we can’t treat two tools as equals if they don’t have the same level of practical functionality.

To explain in practical terms, if a tool doesn’t have data on a particular keyword, one of two things will happen:

You have to use another tool to get the data, which devalues the entire point of using the original tool.
You miss an opportunity to rank for a high-value keyword.

Neither scenario is good, therefore we developed a penalty system. For each 10% match rate under 100%, we deducted a single point from the final score, with a maximum deduction of 5 points. For example, if a tool matched 92% of the keywords, we would deduct .8 points from the final score.
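The penalty rule can be sketched as a one-liner (an illustration of the rule as described):

```python
def match_rate_penalty(match_rate_pct):
    """Deduct one point for each 10% of keywords a tool failed to
    match, capped at a 5-point maximum deduction."""
    return min(5.0, (100 - match_rate_pct) / 10)

for rate in (100, 92, 88, 80):
    print(rate, match_rate_penalty(rate))
# 100 → 0.0, 92 → 0.8, 88 → 1.2, 80 → 2.0
```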

One may argue that this penalty is actually too lenient considering the significance of the two unideal scenarios outlined above.

The penalties are as follows:

Tool: Match Rate | Penalty
KW Finder: 100% | 0
Ahrefs: 100% | 0
Moz: 100% | 0
SEMrush: 92% | -.8
Keyword Planner Tool: 88% | -1.2
SpyFu: 80% | -2

Please note we gave SEMrush a lot of leniency, in that technically, many of the keywords evaluated were not found in its keyword difficulty tool, but rather through manually digging through the phrase match tool. We will give them a pass, but with a stern warning!

Usability Adjustment 2: Reliability

I told you we would come back to this! Revisiting the second test in which we threw away the three strongest outliers that negatively impacted each tool’s score, we will now make adjustments.

In real life, there are no mulligans. In real life, each of those three blog posts that were thrown out represented a significant monetary and time investment. Therefore, when a tool has a major blunder, the result can be a total waste of time and resources.

For that reason, we will impose a slight penalty on those tools that benefited the most from their handicap.

We will use the level of PCC improvement to evaluate how much a tool benefitted from removing their outliers. In doing so, we will be rewarding the tools that were the most consistently reliable. As a reminder, the amounts each tool benefitted were as follows:

Tool: Difference (+/-)
Ahrefs: 0.162
SEMrush: 0.150
Keyword Planner Tool: 0.144
SpyFu: 0.122
KWFinder: 0.110
Moz: 0.101

In calculating the penalty, we scored each of the tools relative to the top performer, giving the top performer zero penalty and imposing penalties based on how much additional benefit the tools received over the most reliable tool, on a scale of 0–100%, with a maximum deduction of 5 points.

So if a tool received twice the benefit of the top performing tool, it would have had a 100% benefit, receiving the maximum deduction of 5 points. If another tool received a 20% benefit over the most reliable tool, it would get a 1-point deduction. And so on.
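That adjustment can be sketched like this, using the benefit figures above (an illustration, not the authors' code):

```python
def reliability_penalty(benefit, min_benefit, max_penalty=5.0):
    """Penalize a tool in proportion to how much more it gained from
    outlier removal than the most reliable (lowest-benefit) tool."""
    extra = benefit / min_benefit - 1          # 0% extra for the top performer
    return round(min(max_penalty, extra * max_penalty), 1)

# PCC improvements after removing each tool's three worst outliers
benefits = {"Ahrefs": 0.162, "SEMrush": 0.150, "KPT": 0.144,
            "SpyFu": 0.122, "KWFinder": 0.110, "Moz": 0.101}
low = min(benefits.values())
print({tool: reliability_penalty(b, low) for tool, b in benefits.items()})
# Ahrefs 3.0, SEMrush 2.4, KPT 2.1, SpyFu 1.0, KWFinder 0.4, Moz 0.0
```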

Tool: % Benefit | Penalty
Ahrefs: 60% | -3
SEMrush: 48% | -2.4
Keyword Planner Tool: 42% | -2.1
SpyFu: 20% | -1
KW Finder: 8% | -.4
Moz: 0% | 0

Results

All told, our penalties were fairly mild, with a slight shuffling in the middle tier. The final scores are as follows:

Tool: Total Score | Stars (5 max)
Moz: 29.7 | 4.95
KW Finder: 24.5 | 4.08
SEMrush: 23.8 | 3.97
Ahrefs: 23.0 | 3.83
SpyFu: 20.3 | 3.38
KPT: -2.6 | 0.00

Conclusion

Using any organic keyword difficulty tool will give you an advantage over not doing so. While none of the tools are a crystal ball, providing perfect predictability, they will certainly give you an edge. Further, if you record enough data on your own blogs’ performance, you will get a clearer picture of the keyword difficulty scores you should target in order to rank on the first page.

For example, we know the following about how we should target keywords with each tool:

Tool: Average KD when ranking ≤ 10 | Average KD when ranking ≥ 11
Moz: 33.3 | 37.0
SpyFu: 47.7 | 50.6
SEMrush: 60.3 | 64.5
KWFinder: 43.3 | 46.5
Ahrefs: 11.9 | 23.6

This is pretty powerful information! It’s either first page or bust, so we now know the threshold for each tool that we should set when selecting keywords.

Stay tuned, because we made a lot more correlations between word count, days live, total keywords ranking, and all kinds of other juicy stuff. Tune in again in early September for updates!

We hope you found this test useful, and feel free to reach out with any questions on our math!

Disclaimer: These results are estimates based on 50 ranking keywords from 50 blog posts and keyword research data pulled from a single moment in time. Search is a shifting landscape, and these results have certainly changed since the data was pulled. In other words, this is about as accurate as we can get from analyzing a moving target.



#4 – Is Passive Income Possible with SEO?

I don’t think there’s a single person who wouldn’t want passive income. In fact, that’s one of the biggest reasons why people get into SEO. For well over a decade, Internet Marketers have been selling the idea that you can generate passive income by learning how to do SEO.

But the question is:

Is it actually possible? That’s what this episode of The SEO Life podcast is all about. Let’s jump in.

The first thing I need to establish is that SEO can be a passive channel for growing a business.

But what determines whether or not your business generates passive income is completely dependent on the monetization model.

I’m not going to hammer you with the fact that SEO is one of the best ways to grow a business. You already know that. What I want to focus on is how you can use SEO to generate passive income.

First, I need to cover what business models will NOT generate passive income.

Non-Passive Model #1 – Service Business

The first is service-based businesses. It's extremely hard to make a service-based business passive. Whether you're offering SEO services or plumbing services, it's still going to require human capital to complete the tasks. You can build out systems, but you still need people to operate the business and get the desired results. Then you have to actually manage your team members.

And lastly, service-based businesses have to sell and deal with customer service. All of these moving parts make it challenging for a service-based business to be passive.

Non-Passive Model #2 – Personal Brands

The second type of business that is NOT passive is one that revolves around your personal brand. For example, the success of Gotch SEO is largely dependent on my personal brand. Sure, I have team members and systems for making things more efficient, but at the end of the day, people are only going to care about Gotch SEO if I'm consistently putting out value and continually trying to establish myself as an SEO expert.

Now, the way I generate income for my company (Gotch SEO) is passive because I sell an information product, Gotch SEO Academy. Anyone can go through my funnel and sign up for the course. This doesn't require me to sell to each person one-by-one. It's passive in that respect.

But the part that isn’t passive is maintaining and building my personal brand. In short, I’m the engine that pushes this business forward. This means that Gotch SEO is NOT a passive income business model.

Listen:

The idea that you should be building your personal brand is incredibly valuable because it can lead to so many opportunities. But that doesn’t mean that all of your business ventures should revolve around it.

That’s why last month I launched a new business that isn’t dependent on my personal brand and I plan to launch a few more. I’ll update you with the progress of these businesses on my blog.

But at the end of the day, all you need to do is look at the most successful companies in the world. They aren’t personal brands. Sure, there are some anomalies such as Kylie Jenner, but even her brand is completely dependent on her. Anyway, I won’t get too deep into this topic, but building a business or website around your personal brand isn’t a passive solution.

So, then what is a passive income model you can use?

The Best Passive Income Option

I believe the most passive income model you can take advantage of, if you understand how to do SEO, is to create niche websites and monetize them through affiliate offers, ads, collaborations, and, in some cases, e-commerce or information products.

I would say that affiliate and ad-driven businesses are the most passive because you completely eliminate one of the most time-intensive and complicated parts of a business: selling. Sure, you need to pre-sell affiliate offers, but that's much easier than having to build out sales funnels, write copy, create sales videos, etc.

But at the end of the day, every business, including an affiliate business, is going to require a massive amount of planning, work, and time. The beauty of SEO is that once it's rolling, you will continue to get traffic even if you aren't actively working on the site.

Why Chasing Passive Income Doesn’t Make Sense

For me personally, I don't think I would ever stop trying to create new revenue sources or working to grow something, even if I had passive income rolling in. I mean seriously, life is pointless if you aren't challenging yourself and continually trying to grow.

My business is in a position where I can go on vacation whenever I want and do what I want on a daily basis, but I still don't do that. I still work like I'm broke.

Trying to grow businesses is fun for me. Sure the income is cool, but I genuinely love the process.

And the reason I'm saying this is because I think you'll end up feeling the same way. Once you achieve the goal of "passive income," you won't be satisfied. You won't just sit back and stop doing anything. I can predict this because if you put in the work to create a passive income asset, then you're not someone who just sits back and lets life come to them.

You’re someone who takes action and someone who is willing to take on risk. That behavior isn’t going to change once you get passive income. Trust me!

But either way, there are some business models that are more desirable than others. I personally love building niche websites because I like having control. Some people like service-based businesses. That’s cool too. You just have to figure out what works for you and most importantly, what makes you the happiest.

So, that’s all for this episode. I’m going to be diving deeper into developing niche websites in later blog posts and videos because I believe it’s the best way for people to monetize their SEO skills. With that said, thank you so much for listening and talk soon!


Building visual reporting in Google Sheets

If you didn’t already know, Google Sheets has a fantastic Google Analytics add-on that can be used to create custom reports in Google Sheets. For more information on the reporting capabilities of this tool, read this blog post from 2016, which will also teach you how to download the add-on and set up a custom report configuration. As an overview, this add-on allows you to:

Quickly pull any data from Google Analytics (GA) accounts you have access to directly into a spreadsheet
Easily compare historical data across custom time periods
Filter and segment your data directly within Google Sheets
Automate regular reporting
Easily tweak your existing reports (which will be saved to Google Drive) to get new data

Beyond how to use the tool – we have free stuff!

All the heaps of data you can pull with this tool are useful, but what if you want to quickly be able to compare data from your custom report configurations? Wouldn’t it be nice to have a reporting view that visually displays how your website is performing week-on-week (WoW) or year-on-year (YoY) by comparing the number of organic sessions and orders (and is near-automated)?

I thought so too.

Recently, I built a report using the Google Analytics add-on within Google Sheets. I have created a templated version of this report to share with you. Feel free to make a copy of it and use it as you please.

Start creating your own Google Sheets reports.

Here are some of the insights this report provides you with:

Organic sessions and orders WoW and YoY for the entire website
Organic sessions and orders WoW and YoY for different page types including category, content, product and seasonal pages
Organic sessions and orders WoW and YoY for the homepage and a static top 20 pages
Organic sessions and orders WoW and YoY for your mobile website

Using formulas, some regular expressions, and conditional formatting, this weekly SEO reporting process is now nearly automated using data from Google Analytics.

Wait, can’t I do all of this in the GA interface already?

Not quite. Here are some of the benefits of this add-on over the standard GA interface:

In the add-on, you can filter on dimensions or metrics that are not already included in your report. In the GA interface, if you're looking at a report with landing pages as the only dimension, you can't use filters to narrow it down to just organic traffic; with the add-on, you can

You can do this in GA using a segment rather than a filter, but segments are more prone to causing sampling issues than filters

Once you have loaded in your data with the add-on, you can manipulate it without having to continually export files
You can do multiple comparisons with the add-on (which I do in this report), whereas in GA you can only do two, i.e. either year on year or week on week comparisons, not both simultaneously
Using the add-on can provide you with a single source of truth, rather than having all the additional data offered by GA that you may not want to be diving into
Using conditional formatting in Google Sheets means that I have been able to flag varying degrees of positive or negative changes by colour

How you can make this report your own

This blog post will walk you through why the report is useful, how you can customise it, and then if you’re curious, I’ve also gone into further detail in an appendix below how the report works. This will also be useful for any potential de-bugging you may have to do.

With the following instructions, even if you’re a beginner to things like regular expressions and Excel/Google Sheets formulas, you should still be able to customise the report and use it yourself.  

To build the report, I've used regular expressions within the report configuration to filter to the specific page types I wanted, and then in the "Comparisons" sheet I've used formulas to pull the data from the sheets and get WoW and YoY comparisons. All of this takes place within a single Google Sheet.

For a one-time report, this would likely not be worth the time invested, but if you or a client have a need for standardised, ongoing reporting – and you have access to the GA data for the account – then this template can be a great way to give you quick, easy insight into your organic traffic trends.

It saves me close to an hour of time a week – or almost 6 working days a year.

What you will need to customise this report

To make this report your own, you’ll need:

To download the Google Analytics add-on for Google Sheets (instructions are here)
Access to the GA account you want to report on
Your GA View ID (instructions on how to find this here)

Other resources you may find useful:

GA’s Query Explorer – can be used to test the output of different combinations of metric and dimension filters
GA Reporting API – lists and describes all the dimensions and metrics available through the Core Reporting API

Why this report is useful

This report uses conditional formatting to make any significant positive or negative changes stand out. It also uses both macro-scale views of the website trends and more detailed views. At the top, it has the total sessions and orders for the entire website, plus the WoW and YoY comparisons, and as you go further down the report it becomes more granular.

I built this report so I could get a better idea of how specific parts of a website were performing. The primary pages I have reported on are the category pages, content pages, product pages and mobile pages. Following that, I have put in the data for a list of the top 20 pages, including the home page. At the end, there is a section for seasonal pages.

The top 20 pages that it reports on are static. These were pre-determined by looking at those pages that consistently had the highest organic sessions. We opted to use a static top 20 rather than the actual top 20 by sessions per week because using the actual would require updating the SUMIF formulas each time the report was run.

This is a report that I update weekly. The date formulas are calculated based on whatever today’s date is and are then used in the report configuration.

This allows the dates to update on their own without me manually having to change them each time I want to run the report. I just had to then schedule the report to run weekly and BAM! – no hands necessary.

To set an auto-run for your report, go to Add-ons > Google Analytics > Schedule Reports, check “Enable reports to run automatically” and then set the time and frequency you want your report to run.

This has made my life much easier, and hopefully sharing it will make your life easier too.

How to customise this report

In this report you are going to have to customise:

Your report configurations
The dates you want this to run
The primary page types you want to compare (we have category pages, content pages, product pages and mobile)
The top 20 pages you wish to report on (you might choose not to use this)
Your seasonal pages, if applicable

Necessary changes – Report Configurations

First level customisation

To learn how to set up and run report configurations, the blog post I referenced at the beginning can help you. For the purpose of this post, I’ll just focus on where you’ll need to tweak it for your website or client.

You will need to put in the View ID you want to report on (Row 3), and you will have to edit the regular expressions in the filters section (Row 9) to make them unique to your client, which I will cover below.

The dates I am using (Rows 4-5) are references to those I have in the comparisons tab. If you want to use different dates, you can either manually change them here, or in the next section, I explain how the date formulas work. Here’s a screenshot of the formula in cell B4 so you can see what I am talking about:

You also are likely going to want to change the Report Name (Row 2) for each column. If you do, be sure that you clearly label each section. The Report Name becomes the name of the sheet that is generated once you run the report, and later the name used in the formulas in the Comparisons tab.

Note that when you change the Report Name, it won’t replace the old one but will instead just create a new one. You’ll have to manually delete the old, unwanted sheets.

Updating the regular expressions

There are two parts of the regular expression that are unique to the website that you will have to update.

The first section that you'll have to update is where I had to filter out PPC data that was being mistakenly reported as organic by GA. For this site, PPC data could be identified by any URL that contained "gclid", "cm_mmc", "newsl", or "google" – this is likely to be different for you, so change what is in the quotation marks to reflect this.

This data was being reported as organic because the tagging used for the PPC campaigns was originally intended for a different reporting platform, so it may not be a problem for you. If it isn't, you can delete this part of the Filters section (everything following ga:medium==organic in cells B9-D9 of the Report Configuration tab).

The second update you'll have to make to the regular expressions is to those used in the Filter sections for all the columns aside from the 'Everything' ones (cells E9-P9). These are used to identify the part of the URL path you want to filter on.

Each filter is separated by a semi-colon, so if you want to add anything to these filters, be sure to include one. In the Core Reporting API, semi-colons mean "AND" and commas mean "OR".

Here is the ‘everything’ section:

ga:medium==organic;ga:landingPagePath!@gclid;ga:landingPagePath!@cm_mmc;ga:landingPagePath!@newsl;ga:landingPagePath!@google

Aside from ga:medium==organic (which restricts the report to organic sessions), this just filters out PPC data.

I’ve copied this expression across all of my sections, but for the sections on specific page types I’ve also included another regular expression to get the specific URLs I am looking for, highlighted below.  For these sections, you’ll see variations of this:

ga:medium==organic;ga:landingPagePath=~\/category\/;ga:landingPagePath!@gclid;ga:landingPagePath!@cm_mmc;ga:landingPagePath!@newsl;ga:landingPagePath!@google

For this example, it was filtering for URLs containing “/category/”. This filters that report down to just our client’s category pages. Again, you can customise this regular expression to your unique website or client. Be sure to escape any slashes you use in this section with a backslash.
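If you want to sanity-check a filter before pasting it into the configuration, you can mimic it locally. Below is a rough Python approximation of the category-page filter, not the API itself; it relies on the Core Reporting API convention that =~ is a regex match and !@ means "does not contain":

```python
import re

# Local approximation of the category-page filter: keep landing pages
# matching /category/ that contain none of the PPC markers from above.
PPC_MARKERS = ("gclid", "cm_mmc", "newsl", "google")

def passes_category_filter(landing_page):
    if any(marker in landing_page for marker in PPC_MARKERS):
        return False  # mirrors the "does not contain" (!@) exclusions
    return re.search(r"/category/", landing_page) is not None  # mirrors =~

print(passes_category_filter("/category/shoes/"))            # True
print(passes_category_filter("/category/shoes/?gclid=abc"))  # False
```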

The mobile sections (cells N9-P9) were a bit different, as this is a defined dimension in GA. You’ll see in those columns that I just added in “ga:deviceCategory==mobile” after the filter for organic.

Once all that is done you can run your reports and move on to customising your Comparisons tab.

Necessary changes – Comparisons tab
Date formulas

The date formulas in cells M13:S18 further automate the reporting. The report defines a week as Monday to Sunday as this was how our client defined theirs, so if this is different for you, you’ll have to change it. If you’re curious how these specific formulas work, I have covered it in more detail in the appendix.

If you do change this section, make sure that the dates are formatted as YYYY-MM-DD. To do this, go to Format > Number > More formats > More date and time formats.

I’ve also left space to enter custom start and end dates. The specific client this was built for wanted to be able to compare odd weeks for their YoY comparisons around specific holidays. These dates will only be used if cells N16-S16 are not blank.

Google Sheets formulas – for primary, top 20 and seasonal pages

Once you’re happy with the dates, the primary thing you need to update are formulas, specifically the names of the sheets being referenced and the criteria that define the pages you want to report on.

If you are getting errors when you customise the formulas, especially #N/A errors, try re-running the formulas in the Comparisons sheet by highlighting the cells and pressing Enter.

For the primary pages at the top in cells B6:K10, if you have changed the Report Names from the previous section you only have to update the sheet names being referenced. You’ll also have to do this for the following sections.

When you’re doing this, be sure not to mix up previous week and previous year.

This can be a long and irritating process. One thing I found that helped speed it up was another Google Sheets add-on, Advanced Find and Replace. This lets you use find and replace within formulas, which means you can simply find "Everything current week – UK" and replace it with whatever alternative you have.

This add-on has a free trial, and once that is up you can only use it once a day – so make the most of it while you have it! If you know of any other free alternatives, I'd love to hear about them.

The formulas in the top 20 pages section, cells B13:K24, differ slightly depending on the page type.

The highlighted part of the formula below is what you'll have to change to match your specific page type. This is from cell B14:

=SUMIF('Everything current week – UK'!$A:$A,"*"&"/top page 2/",'Everything current week – UK'!$B:$B)

The number you’re seeing is a sum of all the pages with /top page 2/ in the URL from the Results Breakdown in my Everything current week – UK tab, shown below.

For the seasonal section in cells B34:K35, you’ll just have to replace where I have either “christmas” or “black-friday” to include whatever specific seasonal term you want to report on. Remember, this must be a reference that is included in the URLs.

Other changes you can make – Report Configuration

For metrics, I have used sessions and transactions, but this can be adjusted if there is a different metric you wish to report on. Just be sure to change the headings in the comparison tab so you remember what you’re reporting on.

For dimensions, I have used the landing pages. Again, you can adjust this if you wish to, for example, report on keywords instead.

I've set the sort order to descending rather than ascending. This organises the data and also helped to determine the top 20 pages.

I have set the limits on these to 1,000. I did this because I only really cared about the specific data for the top pages. The limit does not change the total number that is reported, it just limits the number of rows.

Unfortunately, this is also where I have to talk about sampling. In my report tabs in cells A6 and B6 it says “Contains Sampled Data, No”. If your data is being highly sampled then you need to decide if that will be a roadblock for you or not.

Here is a resource with some ways to get around sampled data.

It’s reporting time

If you’ve made the above changes, once you run your reports with the updates to your Report Configuration, you should have a Google Sheet reporting on your specific data.

That was a lot of information, so if you have any questions or need any help on a specific part of this process please comment below!

As promised, I’ve added an appendix to this post below for those of you that are curious to know in more detail how it works.

Happy reporting!

Appendix: How this report works, if you’re curious
Main report formulas
Totals, WoW and YoY for top report section

Columns B and G for the top section simply pull out whatever number is reported for the total sessions and total orders from each sheet. This is useful not only because it brings all the absolute numbers into one place, but also because I can now reference these cells in formulas.

For WoW relative (Column C), I’ve again referenced those same cells, but created a percentage with a (Current – Previous)/Previous formula.

Column D uses the percentages generated in Column C to extract the absolute number differences.

For YoY relative (Column E), I’ve followed the same exact method, just referencing the data for the previous year rather than the previous week. Again, I used these numbers to extract out the absolute numbers seen in Column F.

The grey orders section does the exact same thing, but instead references the cell in each respective configuration with the order total, rather than sessions.

I've also wrapped these formulas in IFERRORs to prevent the sheet from showing any error messages. This was primarily for aesthetics, although it is worth noting that this can sometimes make it report a 0% change when there was actually, say, a 100% increase, because that page type did not exist in the previous year.

Date formulas

Our client wanted weekly reporting comparing weeks that run from Monday to Sunday. Since GA weeks run from Sunday to Saturday, this had to be customised.

These dates are calculated based off the “=TODAY()” date in cell M14, as well as the first day of last year calculated in M16, the first Monday of last year in M18, and the week numbers in cells O12 and Q12.  

Because these dates are calculated automatically here, in the Report Configuration tab I can simply reference the specific cells from my Comparisons sheet, rather than manually having to enter the dates each time I run the report. This also made it so I can set this report to run automatically every Monday morning before I get into the office.
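The same Monday-to-Sunday logic can be expressed outside of Sheets. This is a sketch of the idea behind the date cells (the exact Sheets formulas differ): find the most recent complete Monday-Sunday week, plus the aligned Monday-Sunday week a year earlier.

```python
from datetime import date, timedelta

def reporting_weeks(today):
    """Most recent complete Monday-Sunday week, plus the aligned week last year."""
    last_monday = today - timedelta(days=today.weekday() + 7)
    last_sunday = last_monday + timedelta(days=6)
    # Going back 52 whole weeks keeps the comparison week starting on a
    # Monday, rather than comparing against the raw date a year earlier.
    yoy_monday = last_monday - timedelta(days=364)
    return (last_monday, last_sunday), (yoy_monday, yoy_monday + timedelta(days=6))

current, last_year = reporting_weeks(date(2018, 8, 15))
print(current)    # (datetime.date(2018, 8, 6), datetime.date(2018, 8, 12))
print(last_year)  # (datetime.date(2017, 8, 7), datetime.date(2017, 8, 13))
```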

You'll also notice that below the dates I have left space to enter custom start and end dates. This is because the specific client this was built for sometimes wants to compare odd weeks for their YoY comparisons to account for specific holidays.

In the Report Configuration sheet, I have an IF formula in the cells that says, if the custom cells are blank then use the usual date, if they are not, then use those. On those occasions, it does mean I have to manually run the reports, but I guess you can’t have everything.

Top 20-page reporting

The Top 20-page section is where the formulas get a bit beastly, but this was something the client specifically requested. We initially wanted it to report on the top 20 pages from each week, but that wasn’t possible using formulas, as we needed something static to reference.

For these, I used a SUMIF formula. For example, in cell C13 I have this formula to report the WoW relative number for the home page:

=IFERROR((SUMIF('Everything current week – UK'!A:A,"*"&".co.uk/",'Everything current week – UK'!B:B)-SUMIF('Everything previous week – UK'!A:A,"*"&".co.uk/",'Everything previous week – UK'!B:B))/SUMIF('Everything previous week – UK'!A:A,"*"&".co.uk/",'Everything previous week – UK'!B:B),0)

Again, the IFERROR statement wrapped around my formula is just to clean things up, so let's drop that and break down what the rest of the formula is doing.

=(SUMIF('Everything current week – UK'!A:A,"*"&".co.uk/",'Everything current week – UK'!B:B)-SUMIF('Everything previous week – UK'!A:A,"*"&".co.uk/",'Everything previous week – UK'!B:B))/SUMIF('Everything previous week – UK'!A:A,"*"&".co.uk/",'Everything previous week – UK'!B:B)

The SUMIF formula sums up cells if they meet specific criteria. It works by defining the range, in this case ‘Everything current week – UK’!A:A (every row in column A of the sheet Everything current week – UK), and then the criteria that you want to be summed. Here, it is all cells which include anything and end with “.co.uk/”.

Lastly, you define the sum range, which is the range to be summed if it is different from the original range defined. We’ve used this here because we want the sum of all the sessions, not the landing page paths. That whole thing spits out the sum of all the sessions on the homepage for the current week. I’ve then subtracted from that number the sum of all the sessions for the previous week.

Finally, I've divided it by the sum of all the sessions of the previous week to get the percent change. I set formatting rules in these cells to display the numbers as percentages, but you could also adapt the formula to multiply by 100.

So within these cells there are two things you are going to have to customise: (1) the names of the sheets being referenced, and (2) the criteria that define the pages you want to report on. You'll notice that in the top 20 pages these differ depending on the page type (they have been intentionally changed for discretion).
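To make the mechanics concrete, here is the same WoW calculation in plain Python with made-up data (the URLs and session counts are placeholders, not client data):

```python
# Plain-Python equivalent of the SUMIF-based WoW calculation: sum sessions
# for landing pages matching a wildcard pattern in each week's data, then
# compute (current - previous) / previous.
def sessions_for(rows, suffix):
    """rows: (landing_page, sessions) pairs; mirrors SUMIF's "*"&suffix match."""
    return sum(s for page, s in rows if page.endswith(suffix))

def wow_change(current_rows, previous_rows, suffix):
    prev = sessions_for(previous_rows, suffix)
    if prev == 0:
        return 0.0  # mirrors the IFERROR fallback in the sheet
    return (sessions_for(current_rows, suffix) - prev) / prev

current = [("www.example.co.uk/", 120), ("www.example.co.uk/shoes/", 80)]
previous = [("www.example.co.uk/", 100), ("www.example.co.uk/shoes/", 90)]
print(wow_change(current, previous, ".co.uk/"))  # 0.2 (homepage up 20% WoW)
```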


Medic: Google’s Latest Algorithm Update

On August 1, Search Engine Roundtable broke the news about a Google algorithm update nicknamed “Medic” that they found was already shaking up search results and core rankings almost overnight.

It’s still a little too early to weigh in strongly on what changed with this update, but we wanted to share everything we’ve been keeping track of over the last week.

The Medic Update

Google was quick to characterize this as a global update. Our team hasn’t personally seen any universal ripples across our client rankings and traffic, but trusted analysts in the market have been vocal about the cross-client impact they’re measuring. What everyone has been able to agree upon to this point is that the impact was felt disproportionately by medical, fitness, and health verticals, hence the name “Medic.”

Early indications from data compiled by SE Roundtable suggest that high domain authority sites received a little extra boost in rankings, at the expense of smaller sites. If that proves true it’s not exactly earth-shattering in terms of implications for online marketers. Rather, it’d mean the big “G” is simply rewarding sites that have been providing content, gathering links, and so forth for a long time, building up a war-chest of DA.

We’ll hold off on the rants about democratizing search results by rewarding quality content and value to users above all else until we’ve got a bit more data.

So What Happened?

Our team is still analyzing data from our extended client portfolio before we share a full evaluation of the Medic update. As with other updates in the recent past, Google will almost certainly be doubling down on their expectation that sites show expertise in subject matter, trustworthiness, and that they create quality content. What specific changes come out of this algorithm update should become more clear in coming weeks, and we’ll update this post accordingly.

Further Reading

Search Engine Land: Google’s Aug. 1 core algorithm update
Marie Haynes Consulting: Google Update Strongly Affected YMYL Sites
Digital Journal: New Google Ads and Algorithm Updates
Reddit: Initial Reactions
Reddit: Ongoing Discussion

The post Medic: Google’s Latest Algorithm Update appeared first on Portent.
