Blog

What is an XML sitemap and why should you have one?

Posted by on Oct 5, 2018 in SEO Articles | Comments Off on What is an XML sitemap and why should you have one?


A good XML sitemap acts as a roadmap of your website which leads Google to all your important pages. XML sitemaps can be good for SEO, as they allow Google to quickly find your essential website pages, even if your internal linking isn’t perfect. This post explains what XML sitemaps are and how they help you rank better.

What are XML sitemaps?

You want Google to crawl every important page of your website, but sometimes pages end up without any internal links pointing to them, making them hard to find. An XML sitemap lists a website’s important pages, making sure Google can find and crawl them all, and helping it understand your website structure:

Yoast.com’s XML sitemap

Above is Yoast.com’s XML sitemap, created by the Yoast SEO plugin. Later on, we’ll explain how our plugin helps you create the best XML sitemaps. If you’re not using our plugin, your XML sitemap may look a little different, but it will work the same way.

As you can see, the Yoast.com XML sitemap shows several ‘index’ XML sitemaps: …/post-sitemap.xml, …/page-sitemap.xml, …/video-sitemap.xml etc. This categorization makes a site’s structure as clear as possible, so if you click on one of the index XML sitemaps, you’ll see all URLs in that particular sitemap. For example, if you click on ‘…/post-sitemap.xml’ you’ll see all Yoast.com’s post URLs (click on the image to enlarge):

Yoast.com’s post XML sitemap

You’ll notice a date at the end of each line. This tells Google when each post was last updated and helps with SEO because you want Google to crawl your updated content as soon as possible. When a date changes in the XML sitemap, Google knows there is new content to crawl and index.
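In the sitemap file itself, that date is the `lastmod` element on each `url` entry, as defined by the sitemaps.org protocol. A minimal single-entry sketch (the URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/an-important-post/</loc>
    <lastmod>2018-10-05</lastmod>
  </url>
</urlset>
```

When a post is updated, only its `lastmod` value needs to change for Google to see that the URL is worth recrawling.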

If you have a very large website, sometimes it’s necessary to split an index XML sitemap. A single XML sitemap is limited to 50,000 URLs, so if your website has more than 50,000 posts, for example, you’ll need two separate XML sitemaps for the post URLs, effectively adding a second index XML sitemap. The Yoast SEO plugin sets the limit even lower – at 1,000 URLs – to keep your XML sitemap loading as fast as possible.
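The arithmetic behind the split is simple: divide the URL count by the per-file limit and round up. A quick sketch in Python (the URL counts are illustrative):

```python
import math

def sitemap_files_needed(url_count: int, limit: int = 50_000) -> int:
    """Number of sitemap files required for url_count URLs at a given per-file limit."""
    return max(1, math.ceil(url_count / limit))

# 120,000 post URLs at the protocol limit of 50,000 URLs per file:
print(sitemap_files_needed(120_000))               # 3 sitemap files
# The same site at Yoast SEO's lower 1,000-URL limit:
print(sitemap_files_needed(120_000, limit=1_000))  # 120 sitemap files
```

Each of those files then gets its own entry in the index XML sitemap.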

What websites need an XML sitemap?

Google’s documentation says XML sitemaps are beneficial for “really large websites”, for “websites with large archives”, for “new websites with just a few external links to it” and for “websites which use rich media content”.

Here at Yoast, while we agree that these kinds of websites will definitely benefit the most from having one, we think XML sitemaps are beneficial for every website. Every single website needs Google to be able to easily find its most important pages and to know when they were last updated, which is why this feature is included in the Yoast SEO plugin.

Which pages should be in your XML sitemap?

How do you decide which pages to include in your XML sitemap? Always start by thinking of the relevance of a URL: when a visitor lands on a particular URL, is it a good result? Do you want visitors to land on that URL? If not, it probably shouldn’t be in your XML sitemap. However, if you really don’t want that URL to show up in the search results you’ll need to add a ‘noindex, follow’ tag. Leaving it out of your XML sitemap doesn’t mean Google won’t index the URL. If Google can find it by following links, Google can index the URL.
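For reference, the ‘noindex, follow’ instruction is a robots meta tag placed in the page’s `<head>`:

```html
<head>
  <!-- Keep this page out of the search results, but still follow its links -->
  <meta name="robots" content="noindex, follow">
</head>
```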

Example 1: A new blog

Say, for example, you are starting a new blog. You will want Google to find new posts quickly so your target audience can find your blog on Google, so it’s a good idea to create an XML sitemap right from the start. You might create a handful of first posts, categories for them, and some tags to start with. But there won’t be enough content yet to fill the tag overview pages, making them “thin content” that isn’t valuable to visitors – yet. In this case, leave the tag URLs out of the XML sitemap for now, and set the tag pages to ‘noindex, follow’ because you don’t want people to find them in search results.

Example 2: Media and images

The ‘media’ or ‘image’ XML sitemap is also unnecessary for most websites. This is because your images are probably used within your pages and posts, so they will already be included in your ‘post’ or ‘page’ sitemap. A separate ‘media’ or ‘image’ XML sitemap would therefore be redundant, and we recommend leaving it out. The only exception is if images are your main business. Photographers, for example, will probably want to show a separate ‘media’ or ‘image’ XML sitemap to Google.

How to make Google find your XML sitemap

If you want Google to find your XML sitemap quicker, you’ll need to add it to your Google Search Console account. In the new Search Console, you can find the sitemaps in the ‘Index’ tab. You’ll immediately see if your XML sitemap has already been added to Search Console. If not, you can add your sitemap at the top of the page:

Yoast.com’s XML sitemap added to the new Google Search Console

Within the old Google Search Console, you can see your sitemaps by navigating to ‘Crawl’ and then clicking on ‘Sitemaps’. If you haven’t added your XML sitemap yet, click on the ‘Add/Test sitemap’ button, which you see to the right of the arrow in the image below.

Yoast.com’s XML sitemap added to the old Google Search Console

As you can see in the image, adding your XML sitemap helps you check whether all the pages in your sitemap have really been indexed by Google. If there is a big difference between the ‘submitted’ and ‘indexed’ numbers for a particular sitemap, we recommend looking into it further. There could be an error preventing some pages from being indexed, or you may need more content or links pointing to the content that hasn’t been indexed yet.
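Besides Search Console, the sitemaps protocol also lets you announce your sitemap to all crawlers via a `Sitemap:` line in your robots.txt file (the URL below is a placeholder):

```
Sitemap: https://www.example.com/sitemap_index.xml
```

This is worth doing anyway, since other search engines read robots.txt too.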

Yoast SEO and XML sitemaps

Because they are so important for your SEO, we’ve added the ability to create your own XML sitemaps in our Yoast SEO plugin. XML sitemaps are available in both the free and premium versions of the plugin.

Yoast SEO creates an XML sitemap for your website automatically. Click on ‘SEO’ in the sidebar of your WordPress install and then select the ‘Features’ tab:

In this screen, you can enable or disable the different XML sitemaps for your website. Also, you can click on the question mark to expand the information and see more possibilities, like checking your XML sitemap in your browser:

You can exclude content types from your XML sitemap in the ‘Search Appearance’ tab. If you select ‘no’ as an answer to ‘show X in the search results?’ then this type of content won’t be included in the XML sitemap.

Read more about excluding content types here.

Check your own XML sitemap!

Now that you’ve read the whole post, you know how important it is to have an XML sitemap, because having one can really help your site’s SEO. Google can easily access your most important pages and posts if you add the right URLs to your XML sitemap. Google will also be able to find updated content easily, so it knows when a URL needs to be crawled again. Lastly, adding your XML sitemap to Google Search Console helps Google find your sitemap fast, and it allows you to check for sitemap errors.

Now go check your own XML sitemap and make sure you’re doing it right!

Read more: WordPress SEO tutorial: definite guide to higher ranking »

The post What is an XML sitemap and why should you have one? appeared first on Yoast.

SEO Title Tags (Everything You Need to Know)

Posted by on Sep 28, 2018 in SEO Articles | Comments Off on SEO Title Tags (Everything You Need to Know)


Optimizing your title tags for SEO is simple:

Just throw your keyword in the title and you’re good to go, right?

Yes and no.

You could stop there and probably do pretty well (if you’ve done everything else right).

But the truth is:

There’s so much more you can do to optimize your title tags.

That’s what this guide is all about.

Make sure you read until the end because I’ll be sharing some title tag optimization tactics that will skyrocket your organic search CTR.

Let’s jump in.

What is a Title Tag?

As the name suggests, an HTML title tag is an element of your web page’s HTML code that indicates its title. It lets both search engines and people know what the page’s content is all about.

You can only have one title tag per page. It will appear in your code as:

<head>
<title>Example of a Title Tag</title>
</head>

Most people will encounter your title tag in four places:

1. Web Browser Tabs

The title tag can be seen on your web browser when you open your page in a new tab.

This is especially helpful when a user has many tabs open and would like to go back to your content. Because of this, it’s important that your title tags are unique, easily recognizable, and immediately distinguishable from other open tabs.

2. Browser Bookmarks

Browser bookmarks in Chrome show the website’s title by default. As you’ll notice below, titles are usually truncated when they’re on the “Bookmarks Bar”.

However, you can see most of a page’s title if you’re using folders. This is a good reason to use short but descriptive titles. More on this soon.

3. Shared Media on Social Media Platforms

You know those little previews on Facebook and Twitter when someone shares content on those platforms? Your title tag will show up there as well, letting people know what the page is about and what they can expect to find when they click on that link.

Some social networks will allow you to customize your title tag just for their platform. An enticing title tag helps draw in more visitors.

If you’re on WordPress, you can customize your OG data using Yoast or the All-in-One SEO pack. You can also download this OG plugin. It doesn’t require any setup, and it will ensure that your “Featured Image” shows up when people share your content on social media.
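For the curious, what those plugins output is a set of Open Graph meta tags in the page’s `<head>`; social networks read these to build the preview. A hand-written sketch with placeholder values:

```html
<head>
  <!-- Open Graph tags, read by Facebook, LinkedIn, and others -->
  <meta property="og:title" content="SEO Title Tags (Everything You Need to Know)">
  <meta property="og:description" content="A short summary shown in the link preview.">
  <meta property="og:image" content="https://www.example.com/featured-image.jpg">
  <!-- Twitter reads its own card tags, falling back to Open Graph -->
  <meta name="twitter:card" content="summary_large_image">
</head>
```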

If you’re having issues with your Featured Image not showing, use the following:

Facebook’s Debugger tool (you can force Facebook to recrawl your page).
LinkedIn’s Post Inspector
Twitter’s Card Validator

4. In the SERPs (Search Engine Results Pages)

One of the most important places where your title will show is in Search Engine Results Pages (that includes Google, Bing, Yahoo, DuckDuckGo, etc).

The title tag shows up as a big, blue clickable link above a short meta description or summary.

This means that if someone found your web page by searching a term that is related to your business, this is your first chance to make a lasting impression and convince them to click on your website.

It’s very easy to add a title tag to your website, but writing an effective one takes time, research, and a little skill (that’s easily developed).

But first:

Why are Title Tags Important for SEO?

Some blogs will tell you that title tags are obsolete in 2018. This is misleading. While title tags may not play the same role in SEO as they did a decade ago, there are still many reasons not to neglect this low-effort, high-impact SEO action.

Here are the benefits of optimizing your title tags (the right way):

1. Keyword Rankings

Do you need to place your target keyword in the title tag to rank well in Google?

The short answer is “Yes”.

The longer answer is that it may not be as important as it once was.

Brian’s research found that having the keyword in the title tag does impact rankings, but it’s a minor factor compared to others:

Image Source: Backlinko.com

Ahrefs also found that “there’s a slight correlation between the usage of keywords in the title tag and rankings.”

Image Source: Ahrefs.com

And finally, one last case study from Matthew Barby also indicated that “The presence of keywords in the page title” does correlate to higher rankings.

Image Source: MatthewBarby.com

Truth be told:

I’ve never attempted to rank pages without using the target keyword phrase in the title tag.

That’s because it wouldn’t make sense for me to stop doing what’s working.

My recommendation will continue to be that you should place your target keyword in the title tag. Just keep in mind that it’s a small factor in the larger ranking equation.

2. SERP Click Through Rate (CTR)

Although there’s some debate about CTR being a ranking factor, there’s no denying that increasing your CTR will increase your organic search traffic.

And just to be clear:

The goal of SEO is to get more organic search traffic. When you change your mindset from “rankings” to “traffic” it changes the way you operate.

Optimizing your title tag for maximum CTR is an intelligent action to take.

I’ll explain some tactics you can use to achieve that goal in a second.

Side note: I lean towards CTR being a direct, or at least an indirect, ranking factor. The way I look at it is that there’s no benefit to NOT optimizing for CTR, even if it isn’t a ranking factor.

Ross Hudgens from Siege Media has an excellent video on this topic, worth a watch:


TL;DW: CTR may not be a direct ranking factor, but it likely impacts rankings indirectly.

3. Social Sharing

Your page’s title is a focal point when it’s shared on social media. Does that mean you need to use clickbait titles like this?:

No, but you should think about why clickbait works.

The truth is clickbait is only annoying when the actual content doesn’t add real value.

4. Headlines Matter

What you place in your title tag is nothing more than a headline. You’ve probably heard the idea that 8 out of 10 Internet users will read a headline, but only 2 out of 10 will read past it.

Or that:

“Five times as many people read the headline as read the body copy. When you have written your headline, you have spent eighty cents out of your dollar.” – Confessions of an Advertising Man (1963) by David Ogilvy

The truth is:

If you’re reading this, then you’re in the minority.

In fact:

Most people only make it through around 17-20% of my content before returning to watching cat videos.

But regardless, the copy you use within your title tag is the first touch point for readers.

You have to do it well or your engagement will be low.

Those are four important reasons why you need to optimize your title tag, but now I need to cover a few important questions:

Does Google Rewrite Titles?

If Google doesn’t think that your title is relevant, readable, or provides value to your site’s visitors, it can and will completely rewrite it – and often in ways that you won’t like.

In fact, here’s what Gary Illyes said:

“We will never quit rewriting titles. We’ve seen so many sites whose title really suck. A lot of sites have no title; a lot of sites have a title saying “Top Page”. In fact, Google almost always rewrites titles. We couldn’t provide useful results to our users if we quit rewriting titles. Experiments showed us users preferred written titles. So, we’ll continue to write titles.” – Gary Illyes (Source)

It’s pretty clear based on Gary’s words that Google’s algorithms will rewrite your titles (and isn’t planning on stopping anytime soon).

But what can you do to prevent it?

The #1 thing you can do is make sure that your title matches your page’s content/intent. If your title is “Buy Shoes”, but your page is all about “buying blue Nikes”, then Google will likely rewrite your title.

Your title should be a 100% match of the page content.

One other factor you need to consider is title tag length.

How Long Should Your Title be?

There are technically no character limits to your title tag, but search engines can only display so much of your title before cutting it off.

If your title is too long, Google will cut it off with an ellipsis (…), which could potentially prevent site visitors from seeing important information about the page.

According to Moz’s research, Google usually displays the first 50-60 characters (including spaces) of a title tag, but the more accurate limit would be 600px. This is because some characters (like M, W, etc.) take up more space than others.

Staying under 60 characters is a good rule of thumb, but you can also use many title tag preview tools like this one just to be sure.
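If you just want a rough character-count check outside of those tools, a few lines of Python will do (keeping in mind that Google’s real cutoff is pixel-based, so 60 characters is only a rule of thumb):

```python
def title_too_long(title: str, max_chars: int = 60) -> bool:
    """Flag titles over the ~60-character display rule of thumb.
    This is only an approximation of Google's ~600px pixel-based cutoff."""
    return len(title) > max_chars

titles = [
    "SEO Title Tags (Everything You Need to Know)",                         # 44 chars: fine
    "SEO Company | SEO Agency | Chicago SEO Company | Best SEO Services",   # over the limit
]
for t in titles:
    print(f"{len(t):3d} chars  {'TOO LONG' if title_too_long(t) else 'ok'}  {t}")
```

Run this over an exported list of your page titles to spot the worst offenders quickly.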

If you’re on WordPress, Yoast and All-in-One SEO pack will do the job.

If you want to find title tags that are too long at scale, then I recommend using Screaming Frog.

Open up Screaming Frog, enter your target domain, click on the “Page Titles” tab, and select “Over 65 Characters” from the filter:

You can click on each individual URL and preview what the title tag looks like in the SERPs. Just click “SERP Snippet” at the bottom:

Can Your Title Tag and H1 be the Same?

The short answer is, yes. You should try to keep your H1 tag consistent with your title tag, but that doesn’t mean it has to be an exact match. For example, this page has a different title tag and H1 tag:

One method you can use is to vary your H1 from your title tag to rank for more long-tail keyword variations. I prefer keeping my H1 nearly identical to the title, but it’s an element worth testing.

You can use Screaming Frog to find all titles that are the same as your H1 tags.

Open up Screaming Frog, enter your target domain, click on the “Page Titles” tab, and select “Same as H1” from the filter:

With some of those important title tag questions out of the way, let me show you:

22 Easy Ways to Optimize Your Title Tags for SEO

Since we’ve already established that a good title tag is a low-effort way to optimize both your SERP ranking and your CTR, how exactly do you go about writing one?

Here are 22 ways to optimize your title tags for better rankings, CTR, and social sharing:

1. Focus on the Content First

That’s right. The first action you need to take is to make sure your SEO content is the highest quality possible. It doesn’t matter how well you optimize your title tag if the page itself is low-value.

Getting the click is important, but getting visitors to dwell longer, visit more than one page, or complete a goal is what the objective should be. That’s only possible if you’re crafting effective SEO content.

Don’t take this step lightly!

2. Identify the Page Type

How you craft your titles will depend on the page type. For example, optimizing a title tag for a product page will be much different than a blog post.

There are a few different types of SEO-driven pages that a website will have:

Homepages

If you decide to optimize your homepage for a target keyword, there’s a good chance it will have middle- or bottom-of-the-funnel intent. For example, HubSpot targets “inbound marketing software” with their homepage.

This keyword phrase has transactional intent so their homepage is structured to drive leads for their software (not educate).

Notice the effective use of a curiosity gap at the end of their title tag as well.

Category Pages

E-commerce websites are the most likely candidates for trying to rank category pages. However, there are some information-driven websites where it makes sense.

For example, RTINGS has a beautifully-structured category page for the target keyword phrase “tv reviews”.

Although the keyword phrase “tv reviews” may lead to a sale in the future, I still consider it to be top of the funnel intent. Or, informational in nature.

Notice that RTINGS front-loads their primary keyword phrase and uses not one, but two modifiers (“Best” and “2018”).

Product Pages

Many product pages will target a combination of Navigational/Transactional keyword phrases. For example, take a look at the keyword phrase “Nike trout 4 cleats”.

Someone searching this keyword is primed to buy, so the title tag needs to reflect that intent.

Local Pages

Keyword stuffing title tags seems to be a common practice on the local level. After digging around, I was able to find an interesting example for the keyword phrase “Los Angeles personal injury lawyer”.

Although I don’t love the idea of jamming “car accident lawyers” in the title, I do like a few things about this title. First, they’ve front-loaded their primary keyword. Second, they’re using numbers within their title, which makes it much more eye-grabbing.

Blog Posts

Crafting title tags for blog posts is the easiest to understand.

Your goal should be to make your title as accurate and interesting as possible. The following tips can drastically improve your blog post title performance.

Most blog posts are going to target keyword phrases with informational intent, so you need to satisfy that.

3. Satisfy Searcher Intent

This applies to both your title and the page itself. The best way to satisfy searcher intent is to think about it from a funnel or buyer journey perspective.

There are four primary categories of searcher intent:

Informational – These are top of the funnel search queries such as “what is SEO”.
Comparison – These are middle of the funnel search queries such as “Ahrefs vs Moz”.
Transactional – These are bottom of the funnel search queries such as “Moz free trial”.
Navigational – These types of search queries are branded like “Gotch SEO”. This means the searcher already knows your brand or may already be a customer.

Most keyword phrases will fall under one or more of these categories.

Your title must satisfy the search intent behind the keyword phrase you’re targeting. You do not want ambiguity. Make it as clear as possible for the searcher.

4. Front-Load Your Primary Keyword

If you approach crafting your title tags from a searcher intent perspective, it would make sense to have the keyword phrase front-and-center. If someone’s searching for “best baseball cleats”, they’re likely to click on a result that showcases that keyword right away.

Keep in mind that “front-loading” doesn’t mean that your keyword phrase needs to be first in the title tag. It just needs to be towards the beginning.

5. Write for Searchers, Not Search Engines

Yes, place your keyword in your title, but don’t do this:

“SEO Company | SEO Agency | Chicago SEO Company”

You wouldn’t believe how often we find this type of keyword stuffing in our SEO audits (check out our SEO audit service if you need help).

There are a few reasons why you shouldn’t stuff keywords in your title tag:

It’s Not Necessary

Google’s algorithms are much more sophisticated than before. More specifically, Google’s Hummingbird algorithm is designed to understand content better.

That means it can identify synonyms and variations of your keywords. You don’t need to jam keyword variations into your title tag. Instead, you can place keyword variations or synonyms naturally throughout your copy and you’ll still perform well for them (given you did everything else right).

You Should Only Target One Primary Keyword Phrase Per Page

Although there are some exceptions to the rule (super authoritative websites), you should aim to target one primary keyword per page.

You’re Losing Precious Real Estate

Most keyword phrases aren’t persuasive in any way. When you stuff your title tag full of keywords, you’re losing the ability to add elements of effective copywriting and persuasion. I’ll be explaining some of these tactics in a second.

6. Use Shorter Titles

Matthew Barby’s research found that shorter titles tend to perform better in Google:

Image Source: MatthewBarby.com

Try to stay below 60 characters (including spaces).

If you’re struggling to keep it below 60 characters, then you should try the following:

Avoid using all-caps in your title tag. Capital letters take up more space than lowercase letters.
Avoid unnecessary punctuation
Remove redundant or repetitive words
Use short phrases instead of long, complicated ones

7. Avoid Duplicating Page Titles

No two pages (that you want indexed in Google) should have the same title. The best way to find duplicate page titles is to use Screaming Frog SEO Spider.

Open up Screaming Frog SEO Spider, enter the target domain, and click on the “Page Titles” tab:

Then click the “Filter” dropdown and select “Duplicate”:

Sort the list by “Title 1”:

You only need to be concerned about duplicate title tags if your page is indexed. The new version of Screaming Frog makes this super easy with their new “Indexability” column.
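If you can’t run Screaming Frog, the same duplicate check is easy to sketch over any list of (URL, title) pairs exported from your CMS. The data below is made up, and normalizing case and whitespace is my own choice here:

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group URLs by (normalized) title; return only titles used by more than one page."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = [
    ("/backlinks/", "Backlinks Guide"),
    ("/buy-backlinks/", "Backlinks Guide"),   # duplicate title
    ("/anchor-text/", "Anchor Text Guide"),
]
print(find_duplicate_titles(pages))  # {'backlinks guide': ['/backlinks/', '/buy-backlinks/']}
```

Each group it returns is a set of pages that needs a rewritten, unique title.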

8. Write Unique Titles for EVERY Page

Every page on your website should have a unique title. In fact, according to Google:

“Titles are critical to giving users a quick insight into the content of a result and why it’s relevant to their query. It’s often the primary piece of information used to decide which result to click on, so it’s important to use high-quality titles on your web pages.” – Google

The best way to find pages with missing titles is to once again, use Screaming Frog SEO Spider.

The steps are identical to those above, except you’ll select “Missing”:

9. Use Title Modifiers

If you dig through my content on Gotch SEO, you’ll discover that I love using title modifiers. I believe using title modifiers is one of the best ways to drive more long-tail traffic (without much effort).

I actually call this The Phantom Technique because many of these keyword variations are largely untapped.

Here’s a free video from my paid training course, Gotch SEO Academy explaining how to execute this tactic:

Like this training? Join Gotch SEO Academy today and save 20% when you use coupon code “titletag” at checkout.

With that said:

Some simple title modifiers you can use are “top”, “best”, or the year.

Important note: If it’s relevant to use a year in your title tag, make sure that your URL doesn’t include it. For example, I update my anchor text guide every year and change the year in the title tag, but the URL never changes.

That means I can continue to build the authority of that page because my URL isn’t changing every year.

10. Build a Keyword Variation List

I also build a keyword variation list every time I find a new primary keyword phrase to target. For example, my primary keyword phrase for my backlinks guide is “backlinks”.

But obviously my title couldn’t just be “Backlinks | Gotch SEO” because A) that’s boring and B) I would lose out on long-tail traffic.

Instead, I searched for relevant keyword variations I could naturally add to the title.

Ahrefs Keyword Explorer is perfect for this task.

Enter your primary keyword phrase, start the analysis, and then click on “Phrase Match”:

This section is a goldmine for finding keyword variations for your title.

You can also use UberSuggest and Keywords Everywhere to build your keyword variation list (both are free).

Although you won’t use 99.9% of these variations in your title tag, a large percentage of these keywords can be dispersed throughout your page.

11. Emphasize Freshness

Do you know anyone that prefers old content? I don’t and that’s why emphasizing “freshness” in your title works really well.

One persuasion principle that I picked up from Frank Kern is that people love “new” things. In fact, something simply being “new” can be a big driving force.

Hence the reason why you’re more likely to buy a newer model car than a car from the 80s.

Another example is when you see a training course use “2.0” or “Revamped” in its headline. They’re emphasizing freshness.

Some ways to incorporate freshness into your title tags are to use the word “new”, “updated for YEAR”, “new data”, etc.

12. Use the H & W Strategy

The H & W strategy is simple: just use one of the following words in your title tag: “How,” “What,” “Why,” “When,” “Where,” or “Who.”

How to {Create|Learn|Build|Use|Leverage|Increase|Get|Do}…

Example: How to Tie a Windsor Knot

Total Organic Keywords: 5,079
Total Linking Root Domains: 161
Total Social Shares: 819 (Buzzsumo)

What {are|is}?

Example: What Are Second Cousins vs. Cousins Once Removed

Total Organic Keywords: 2,600
Total Linking Root Domains: 59
Total Social Shares: 1.9 Million (Buzzsumo)

Why

Example: Why the Myers-Briggs Test is Meaningless

Total Organic Keywords: 2,500
Total Linking Root Domains: 77
Total Social Shares: 19,000 (Buzzsumo)

When

Example: 21 High-Protein Snacks To Eat When You’re Trying To Be Healthy

Total Organic Keywords: 1,800
Total Linking Root Domains: 32
Total Social Shares: 28,000 (Ahrefs)

Where

Example: The Complete Guide to Where to Put Your Eye Makeup

Total Organic Keywords: 5,200
Total Linking Root Domains: 33
Total Social Shares: 26,000 (Ahrefs)

13. Use Numbers

We’ve all fallen victim to consuming numbered listicles at one point or another. That’s because they’re super effective.

A study by Conductor found that 36% of respondents preferred headlines that included numbers:

Image Source: Moz.com

An example of an effective listicle post is “18 Unforgettable Countries Where You Can Roll Big on $50 a Day“. This example ranks for “cheapest countries to visit” (~3,600 searches/mo), has 45 linking root domains, and over 81,000 social shares.

Outside of the traditional listicle, you can also use monetary values such as: “Silicon Valley’s $400 Juicer May Be Feeling the Squeeze”.

Or, you can use percentages in title tags like this: “Nike’s online sales jumped 31% after company unveiled Kaepernick campaign“.

14. Use This Secret Title Tag Hack (Copywriters Hate It)

Ahh… yes, the classic clickbait headline.

I know I’ve fallen for many, but that’s because they work well! Mainly because they leave open loops in your mind and engage our natural human curiosity.

The trick here is to give readers a sneak peek into what they can find out by clicking on your link without giving too much away.

Employ as much tantalizing language as necessary; remember: you need to evoke surprise, amazement, or speak to a deeply-rooted fear. You can combine this technique with the other techniques above to create a truly click-worthy headline.

Example: 7 Unbelievable Exercises That Will Help Keep Your Nose In Shape

Total Organic Keywords: 3,500
Total Linking Root Domains: 17
Total Social Shares: 12,000 (Ahrefs)

Note: Use clickbait tactics sparingly because they can come across as annoying or inauthentic. Overuse could hurt your brand’s perceived value.

15. Be the Most Comprehensive

Fear of Missing Out (FOMO) applies in many different scenarios, but especially with knowledge gaps. People want assurance that they aren’t missing out on any important information.

That’s why {Complete|Ultimate|Definitive} guides work well.

Example: The Ultimate Guide To Brunching In NYC

Total Organic Keywords: 3,300
Total Linking Root Domains: 62
Total Social Shares: 48,000 (Ahrefs)

16. Emphasize Speed (or Time Savings)

One of the most powerful benefits to emphasize is saving time. Although this usually applies to products, it can be emphasized in title tags as well.

Use words like “fast”, “quick”, “simple”, etc.

Example: How to Get Rid of Stretch Marks Fast

Total Organic Keywords: 4,200
Total Linking Root Domains: 113
Total Social Shares: 160,000 (Ahrefs)

17. Break the Pattern

Pattern interrupts are common in video content, but there are ways to break the pattern in the SERPs as well. Some of the best methods are to use [brackets], {curly brackets}, (parentheses), equals signs (=), plus (+) or minus (-) signs, or pretty much any out-of-the-ordinary symbol.

You can also test using emojis in title tags. Google doesn’t always show them, though.

18. Use Title Tags to Find Keyword Cannibalization

Keyword cannibalization occurs when two or more pages on your website are optimized for the same keyword phrase. Auditing your title tags using Screaming Frog SEO Spider is actually one of the fastest ways to identify keyword cannibalization.

Open up SFSS, enter your target domain, click on the “Page Titles” tab, and keep the filter set to “All”:

You can then use SFSS’s built-in search function to find pages that are similar. In this example below, I searched “backlinks” and identified two pages using that primary keyword phrase.

In this case, it doesn’t make sense to consolidate these assets because the intent behind “how to build backlinks” vs. “buy backlinks” is much different.

Identifying keyword cannibalization issues requires manual analysis, but it’s time well spent.
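Spotting cannibalization candidates in a title export can also be scripted. This is a hypothetical sketch, not a Screaming Frog feature: the URLs and titles below stand in for rows from a “Page Titles” export, and the function simply flags titles that share a keyword.

```python
# Hypothetical rows mimicking a crawler's "Page Titles" export: (URL, title).
pages = [
    ("https://example.com/build-backlinks", "How to Build Backlinks in 2018"),
    ("https://example.com/buy-backlinks", "Should You Buy Backlinks?"),
    ("https://example.com/seo-basics", "SEO Basics for Beginners"),
]

def pages_targeting(keyword, rows):
    """Return URLs whose title contains the keyword (case-insensitive)."""
    kw = keyword.lower()
    return [url for url, title in rows if kw in title.lower()]

# Two or more matches means those pages need a manual intent check.
matches = pages_targeting("backlinks", pages)
```

As in the example above, a keyword match is only a candidate: search intent decides whether consolidating the pages actually makes sense.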

19. Test Your Titles

How do you know if your title will be effective? Well, the good news is that it doesn’t have to be a shot in the dark. I recommend using AM Institute’s tool to test and refine your titles before going live:

You can also try CoSchedule’s free headline analyzer tool.

20. Incorporate All the Methods

The good news is that you don’t need to stick to just one technique. Mix and match the title tag optimization methods to get the best results possible.

21. Measure Performance with Google Search Console

Google Search Console shows you CTR data for your organic keywords. Just click on the “Performance” tab and you’ll get access to all kinds of useful data:

Although your CTR is determined by more than just your title tag, it’s one of the most important factors. If you are ranking well, but your CTR is subpar, then you should test changing your title.

Here’s a simple title tag testing framework I use:

Create 10-20 title variations
Qualify the idea using AM Institute’s tool
Execute the change
Annotate the change in Google Analytics
Wait (at least 3-4 weeks) – You need to give Google time to recrawl the page and see whether there’s a positive or negative impact.

The goal of these tests is to increase CTR.
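Since CTR is the success metric for this framework, the before/after comparison is simple arithmetic. A minimal sketch with made-up numbers (in practice, clicks and impressions come from the Performance report):

```python
def ctr(clicks, impressions):
    """Click-through rate as a percentage; 0 when there were no impressions."""
    return 100.0 * clicks / impressions if impressions else 0.0

# Hypothetical figures for the 3-4 weeks before and after a title change.
before = ctr(clicks=120, impressions=4000)  # 3.0%
after = ctr(clicks=180, impressions=4000)   # 4.5%
improved = after > before
```

Keep the comparison windows the same length, since impressions fluctuate week to week.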

Keep in mind: navigational search queries that aren’t your brand name, like “Blogspot” (I’ve been floating between the #2 and #5 spots), will have a low CTR:

Changing your title tag won’t do much in this scenario because it’s based on intent.

On the other hand:

Navigational search queries that ARE for your brand (branded search) should have an exceptionally high CTR:

22. Be Realistic

All of these methods will help you optimize your title tags for peak SEO performance.

But don’t forget:

Placing your keyword in your title tag is a micro ranking factor.

Think of it as the bare minimum for ranking well.

That’s All for Title Tags!

I hope this guide helped you learn a thing (or two) about title tags.

If you got a lot of value out of this post, please share it and drop a comment below, because I respond to every single one.

The Secret Behind My 1,866,913 Monthly Search Visitors (It’s Not What You Think)

Posted by on Sep 21, 2018 in SEO Articles | Comments Off on The Secret Behind My 1,866,913 Monthly Search Visitors (It’s Not What You Think)

The Secret Behind My 1,866,913 Monthly Search Visitors (It’s Not What You Think)

How many visitors do you think NeilPatel.com generates each month?

Maybe a million… maybe 2 million?

I bet you’re going to guess 1,866,913.

If that’s what you guessed, you are wrong. This blog actually generated 2,530,346 visitors. 1,866,913 is the number that came from search engines.

So, what’s the secret to my ever-growing Google traffic?

Sure, I have optimized my on-page SEO, I’ve built links, written tons of blog posts… I’ve done all of the stuff that most of my competition has done. But doing the same stuff as your competition isn’t enough.

My secret sauce is that I optimize for user signals.

Last week, I broke down some of the user signals Google looks at and provided benchmarks to aim for if you don’t want to be penalized by Google.

If you aren’t familiar with user signals, check the article I linked to above.

So, how do you optimize for user signals?

Well, I know everyone has different types of websites, so I thought I would share the process I use to optimize NeilPatel.com.

Are you showing people what they want?

Google Analytics is an amazing tool. I’m so addicted to it that I log in at least 3 or 4 times a day. Heck, I even log in on weekends.

But here’s the thing: it only tells you half the story. It gives you numbers, but it doesn’t help you visualize what people are doing and what they aren’t.

For example, here is what my main blog page looked like according to Crazy Egg:

What’s wrong with the image?

Everyone is going to the blog to learn more about marketing. Above the fold, I have a box that showcases an SEO Analyzer. But there is one big issue: it’s barely clicked compared to the drop-down that lets you filter the blog content.

The SEO Analyzer had 128 clicks versus 359 clicks to the content filtering option.

Because you didn’t care for it as much, I removed it from the main blog page. And now when you head to the blog page you can see the filtering options above the fold.

I am looking to see what you click on and what you don’t. Simple as that.

If I keep showing you something you aren’t clicking on, I am wasting the opportunity to present you with something you do want to see. Which means I either need to adjust it or delete it.

Now, let me show you my current homepage:

What’s wrong?

Go ahead, take a guess…

Well, looking at the image, you’ll notice there are tons of hot spots in the footer. That’s where the navigation is. With all of those clicks landing on the navigation, I should consider adding a navigation menu bar in the header.

Are you getting the hang of how to make your website more user-friendly? Well, let’s try another one.

Here’s an element in the sidebar of my blog posts:

That element only has 1 click. That’s terrible considering that the blog post generated 10,016 visits. And to top it off, that click came from a repeat visitor.

My goal is to convert more first-time visitors into leads; they make up the majority of my visitors but the lowest percentage of my leads.

So, what did I do? I deleted that element and you no longer see it in my sidebar.

Are you optimizing for mobile?

Let’s face it, more people are visiting your site using mobile devices than laptops or traditional computers.

If that’s not the case, it is just a matter of time.

So, have you optimized your site for mobile? And no, I’m not just talking about having a responsive design because everyone is doing that these days.

If you look at the image above, you’ll notice that I removed the image of myself and a few other elements. This helps make the loading experience faster and it helps focus people’s attention on the most important elements.

Similar to the desktop version, my mobile homepage has a 24% conversion rate. When my mobile version included a picture of me above the fold, my conversion rate dropped to 17%… hence there is no picture of me. 😉

Now, I want you to look at the mobile version of my main blog page and compare it to my homepage.

Do you see an issue?

The blog page generates a lot of clicks on the 3 bars at the top… that’s my navigation menu.

My developer accidentally removed that from the mobile homepage. That’s why the contact button in the footer of the homepage gets so many clicks.

Hopefully, that gets fixed in the next day or two as that could be negatively impacting my mobile rankings.

On top of optimizing the mobile experience, you need to ensure your website loads fast. It doesn’t matter if people are using LTE or 4G, sometimes people have terrible reception. And when they do, your website will load slowly.

By optimizing it for speed, you’ll reduce the number of people who just bounce away from your site.

If you want a faster load time, follow this.

And don’t just optimize your site for speed once and forget about it. As you make changes to your site, your pagespeed score will drop, which means you’ll have to continually do it.

For example, you’ll notice I have been making a lot of changes to NeilPatel.com (at least that is what the heatmaps above show). As I make those changes, sometimes they affect my pagespeed score negatively. That means I have to go back and optimize my load time again.

On average, a one-second delay in load time will cost you 6.8% of your revenue.

Are you focusing on helping all of your users?

Not every person who visits your website is the same.

For example, a small percentage of the people who visit NeilPatel.com work at large corporations that are publicly traded and are worth billions of dollars.

And a much larger percentage of my visitors own small and medium-sized businesses. These people are trying to figure out how to grow their traffic and revenue without spending an arm and a leg.

And the largest percentage of my visitors don’t have a website and they are trying to figure out how to get started for free.

In a nutshell, I have three groups of people who visit my website. The first group tends to turn into consulting leads for my agency, but they make up the smallest portion of my traffic.

One could say that I should only focus on helping them and ignore everyone else. But I can’t do that for a few reasons…

I started off with having practically no money and people helped me out when I couldn’t afford to pay them. I love paying it forward and helping people who can’t afford my services because I have been there, and I know what it’s like.
If I only focused on the large companies, who would link to my website and promote my content? You can bet that Microsoft isn’t going to link to me on a regular basis. If you want to generate social shares and backlinks you have to focus on the masses.
Little is the new big… if you can please the masses, they will make noise and the big players will eventually hear about you. So, don’t just treat people with deep pockets kindly, treat everyone the same and truly care about your visitors.

Once you figure out the types of people coming to your website (and if you are unsure just survey them), go above and beyond to help them out. Create different experiences for each group.

On NeilPatel.com, I’ve learned that people who work at large corporations are busy and they want to listen to marketing advice on the run. For that reason, I have the Marketing School podcast.

And a lot of beginners wanted me to break down my steps over video, so they can more easily replicate my tactics. For that reason, I create new videos 3 times per week giving marketing and business advice.

Many of you want to attend the conferences that I speak at, but can’t afford to buy a ticket. For those people, I create weekly webinars that are similar to the speeches I give at conferences.

And best of all, I know the majority of you find it hard to follow along with all of these tips as it can be overwhelming. So, I created Ubersuggest to help you out.

In other words, I try to go above and beyond for all of my visitors.

Yes, it is a lot of work, but if you want to dominate an industry it won’t happen overnight. Expect to put in a lot of time and energy.

Are you taking feedback from people?

You are going to get feedback. Whether it is in the form of email or comments, people will give you feedback.

It’s up to you if you want to listen… but if a lot of people are telling you the same thing you should consider it.

For example, I get a ton of comments on YouTube from people asking me to create videos in Hindi.

And…

Now, I am not only working on adding Hindi subtitles to my videos, but I am also working on translating my blog content to Hindi.

I’m not doing this to make more money… I’m not doing it to become popular… I’m just doing it to help out more people.

It’s the same reason why I have Spanish, Portuguese, and German versions of this website. I had enough requests that I pulled the trigger, even though I am not focused on generating income in those regions.

But here is the thing that most people don’t tell you about business. If you just focus on helping people and solving their problems, you’ll notice that your income will go up over time.

Businesses make money not because their goal is to make money… they make money because they are solving a problem and helping people out.

Another piece of feedback I have been getting recently is that my blog is too hard to read on mobile devices.

For that reason, I’ve assigned a task to one of my developers to fix this.

Conclusion

Traffic generation is a business. It’s not a hobby. It’s competitive, and it’s difficult to see short-term gains.

If you want to rank at the top of Google, you can’t treat your website as a hobby. You have to treat it like a business.

And similar to any business, you won’t succeed unless you pay attention to the needs of your customers. That means you have to listen to them. Figure out what they want and provide it.

That’s what Google is trying to do. They are trying to rank sites that people love at the top of their search engine. If you want to be one of those sites, then start paying attention to your visitors.

Show them what they want and go above and beyond so that they will fall in love with your website instead of your competition.

If you aren’t sure whether you are making the right changes, monitor your brand queries. Growth in the number of people searching for your brand terms on Google is a big leading indicator that people are happy with your website.

Just look at NeilPatel.com: I get over 40,000 visitors a month from people Googling variations of my name:

And I generate over 70,000 visits a month just from people searching for my free tool, Ubersuggest.

That’s how I’m continually able to make my traffic grow.

Yes, I do pay attention to what Google loves, but more importantly, I pay attention to your needs and wants.

Are you going to start optimizing your website for user signals?

The post The Secret Behind My 1,866,913 Monthly Search Visitors (It’s Not What You Think) appeared first on Neil Patel.

Local Business Transparency & Empathy for the Holidays: Tips + Downloadable Checklist

Posted by on Sep 14, 2018 in SEO Articles | Comments Off on Local Business Transparency & Empathy for the Holidays: Tips + Downloadable Checklist

Posted by MiriamEllis

Your local business will invest its all in stocking shelves and menus with the right goods and services in advance of the 2018 holiday season, but does your inventory include the on-and-offline experiences consumers say they want most?

Right now, a potential patron near you is having an experience that will inform their decision of whether to do business with you at year’s end, and their takeaway is largely hinging on two things: your brand’s transparency and empathy.

An excellent SproutSocial survey of 1,000 consumers found that people define transparency as being:

Open (59%)
Clear (53%)
Honest (49%)

Meanwhile, after a trying year of fake news, bad news, and privacy breaches, Americans could certainly use some empathy from brands that respect their rights, needs, aspirations, and time.

Today, let’s explore how your local brand can gift customers with both transparency and empathy before and during the holiday season, and let’s make it easy for your team with a shareable, downloadable checklist, complete with 20 tips for in-store excellence and holiday Google My Business best practices:

Grab the Holiday Checklist now!

For consumers, even the little things mean a lot

Your brother eats at that restaurant because its owner fed 10,000 meals to displaced residents during a wildfire. My sister won’t buy merchandise from that shop because their hiring practices are discriminatory. A friend was so amazed when the big brand CEO responded personally to her complaint that she’s telling all her social followers about it now.

Maybe it’s always been a national pastime for Americans to benefit one another with wisdom gained from their purchasing experiences. I own one of the first cookbooks ever published in this country and ‘tis full of wyse warnings about how to avoid “doctored” meats and grains in the marketplace. Social media has certainly amplified our voices, but it has done something else that truly does feel fresh and new. Consider SproutSocial’s findings that:

86% of Americans say transparency from businesses is more important than ever before.
40% of people who say brand transparency is more important than ever before attribute it to social media.
63% of people say CEOs who have their own social profiles are better representatives for their companies than CEOs who do not.

What were customers’ chances of seeking redress and publicity just 20 years ago if a big brand treated them poorly? Today, they can document with video, write a review, tweet to the multitudes, even get picked up by national news. They can use a search engine to dig up the truth about a company’s past and present practices. And… they can find the social profiles of a growing number of brand representatives and speak to them directly about their experiences, putting the ball in the company’s court to respond for all to see.

In other words, people increasingly assume brands should be directly accessible. That’s new!

Should this increased expectation of interactive transparency terrify businesses?

Absolutely not, if their intentions and policies are open, clear, and honest. It’s a little thing to treat a customer with fairness and regard, but its impacts in the age of social media are not small. In fact, SproutSocial found that transparent practices are golden as far as consumer loyalty is concerned:

85% of people say a business’ history of being transparent makes them more likely to give it a second chance after a bad experience.
89% of people say a business can regain their trust if it admits to a mistake and is transparent about the steps it will take to resolve the issue.

I highly recommend reading the entire SproutSocial study, and while it focuses mainly on general brands and general social media, my read of it correlated again and again to the specific scenario of local businesses. Let’s talk about this!

How transparency & empathy relate to local brands

“73.8% of customers were either likely or extremely likely to continue to do business with a merchant once the complaint had been resolved.”
– GetFiveStars

On the local business scene, we’re also witnessing the rising trend of consumers who expect accountability and accessibility, and who speak up when they don’t encounter it. Local businesses need to commit to openness in terms of their business practices, just as digital businesses do, but there are some special nuances at play here, too.

I can’t count the number of negative reviews I’ve read that cited inconvenience caused by local business listings containing wrong addresses and incorrect hours. These reviewers have experienced a sense of ill-usage stemming from a perceived lack of respect for their busy schedules and a lack of brand concern for their well-being. Neglected online local business information leads to neglected-feeling customers who sometimes even believe that a company is hiding the truth from them!

These are avoidable outcomes. As the above quote from a GetFiveStars survey demonstrates, local brands that fully participate in anticipating, hearing, and responding to consumer needs are rewarded with loyalty. Given this, as we begin the countdown to holiday shopping, be sure you’re fostering basic transparency and empathy with simple steps like:

Checking your core citations for accurate names, addresses, phone numbers, and other info and making necessary corrections
Updating your local business listing hours to reflect extended holiday hours and closures
Updating your website and all local landing pages to reflect this information

Next, bolster more advanced transparency by:

Using Google Posts to clearly highlight your major sale dates so people don’t feel tricked or left out
Answering all consumer questions via Google Questions & Answers in your Google Knowledge Panels
Responding swiftly to both positive and negative reviews on core platforms
Monitoring and participating in all social discussion of your brand when concerns or complaints arise, letting customers know you are accessible
Posting in-store signage directing customers to complaint phone/text hotlines

And, finally, create an empathetic rapport with customers via efforts like:

Developing and publishing a consumer-centric service policy both on your website and in signage or print materials in all of your locations
Using Google My Business attributes to let patrons know about features like wheelchair accessibility, available parking, pet-friendliness, etc.
Publishing your company giving strategies so that customers can feel spending with you supports good things — for example, X% of sales going to a local homeless shelter, children’s hospital, or other worthy cause
Creating a true welcome for all patrons, regardless of gender, identity, race, creed, or culture — for example, gender neutral bathrooms, feeding stations for mothers, fragrance-free environments for the chemically sensitive, or even a few comfortable chairs for tired shoppers to rest in

A company commitment to standards like TAGFEE coupled with a basic regard for the rights, well-being, and aspirations of customers year-round can stand a local brand in very good stead at the holidays. Sometimes it’s the intangible goods a brand stocks — like goodwill towards one’s local community — that yield a brand of loyalty nothing else can buy.

Why not organize for it, and for the mutual benefit of business and society, with a detailed, step-by-step checklist you can take to your next team meeting?

Download the 2018 Holiday Local SEO Checklist

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

What We Learned in August 2018: The Digital Marketing Month in a Minute

Posted by on Sep 6, 2018 in SEO Articles | Comments Off on What We Learned in August 2018: The Digital Marketing Month in a Minute

What We Learned in August 2018: The Digital Marketing Month in a Minute

The average Briton spends over 2 hours online per day

Several fascinating findings were revealed in Ofcom’s annual review of the UK communications sector. Key takeaways include:

On average, people in the UK spend the equivalent of a day online per week

Facebook’s reach among 18-24 year-olds is in decline (by 4% YoY)

The smartphone video advertising market is worth over 1 billion pounds

Read the full story (LinkedIn)

Google risks class action due to “surreptitious” tracking of user location

A lawsuit was filed against Google in California after it was found that Google still tracks a smartphone’s location even when the “Location History” setting is turned off. Potential consequences include a substantial fine and the deletion of some of Google’s location tracking data. A few days after the lawsuit, the Location History support page on Google’s website was changed from “with Location History off, the places you go are no longer stored” to “some location data may be saved as part of your activity on other services, like Search and Maps.”

Read the full story (Marketing Land)

Google improves accuracy of Index Coverage report in Search Console

Google has updated the Index Coverage report within Search Console for the first time since its launch. The feature was originally introduced in 2017 to provide information on which pages of your website have or have not been indexed (with instructions to fix issues). According to Google, the new update will significantly improve the accuracy of the report starting from August. The only drawback of this refresh is that index coverage data for the period of July 14 to August 1 was not fully recorded, so it was estimated based on the values recorded on the 1st of August.

Read the full story (Search Engine Journal)

Google confirms core algorithm update

On August 1, Google confirmed industry rumours about remarkable ranking fluctuations by announcing the rollout of a core algorithm update. Nicknamed the “medic update” in the SEO industry due to the large number of health and medical sites affected, its reach actually extended beyond that bracket to the broader category of YMYL (Your Money or Your Life) sites.

Read the full story (Moz)

Study reveals truth behind shopping via voice search

Despite being one of the most hyped subjects of 2018, a recent study from The Information revealed that voice search does not yet seem to be driving sales. Only 2% of the customers who use Amazon’s Alexa intelligent assistant appear to have made a purchase via voice search, and only 10% of those have made a second purchase. This is probably due to an inefficient consumer journey, device limitations or simply that people are not generally aware of the capabilities yet. How soon might this change?

Read the full story (TechCrunch)

Schema markup for datasets now supported in the SERP

Google has confirmed that dataset markups will be supported in the SERP. By doing so, Google is trying to improve the way users visualise data in the search result page, rewarding organisations that mark up datasets such as tables, CSV files, images containing data and more.

As Google stated, this new markup aims “to improve discovery of datasets from fields such as life sciences, social sciences, machine learning, civic and government data, and more”.

Read our blog to learn how to understand and implement structured data.

Read the full story (Search Engine Land)

FB launches mobile-first video creation for better ads

Facebook has rolled out a new set of tools aimed at advertisers that produce assets with mobile-first in mind, since their research has proven that “mobile-first creative has a 27 percent higher likelihood of driving brand lift compared to ads that are not optimized for mobile.”

It is now possible to add motion to existing images/videos or create videos from assets such as pictures and logos.

Read the full story (Marketing Land)

Ad spend on Instagram on the rise

Despite Facebook having experienced the largest one-day fall in American stock market history on the 26th of July, the stock is still trading at May 2018 levels. In their latest earnings report, they disclosed strong growth in ad spend on Instagram in Q2, up 177% year-over-year.

Instagram’s user base has surpassed a billion active users and, according to the social media listening company Socialbakers, brands’ Instagram profiles have much higher user engagement than their Facebook equivalents. While there are challenges around the advertising opportunities in the “stories” functionality, Facebook as a whole is continuing to see its playbook work supremely well.

Read the full story (Marketing Land)

Europe to fine sites for not taking down illegal content within one hour

The EU is planning to take a stronger position on illegal or controversial material posted online, especially on social media platforms such as Facebook, Twitter and YouTube. Julian King, EU Commissioner for Security, has put forward legislation which would fine tech companies that do not remove illegal content within one hour. This follows wide reporting of a recent study which found a correlation between social media usage and hate crimes (the study was based on violence against refugees in Germany in 2018), though the study has received some criticism, in particular related to its use of global “likes” for Nutella as a proxy for German Facebook usage.

Read the full story (Business Insider)

Twitter’s new tests: threaded replies and online status indicators

Twitter’s director of product management, Sara Haider, has posted a few screenshots that display some new features Twitter is working on to improve threaded conversations and add online status indicators. The reason behind these changes is to make Twitter “more conversational”. Neither feature appears to be groundbreaking if compared to other social media platforms: threaded replies will potentially look very similar to Facebook’s comments, while online status indicators have been used by Facebook Messenger and Instagram’s direct messages. The tech giant is currently collecting feedback before rolling out the changes.

Read the full story (Search Engine Journal)

Distilled News

We kick off the Distilled news this month with a post from Craig Bradford explaining what SEO split testing is. The post focuses on the simple principle of split testing, how it differs from CRO and UX testing, and covers a few examples to outline Distilled’s methodology. As the VP in charge of Distilled’s ODN platform, Craig has a front-row seat to this hot area of SEO.

Analyst Paola Didone took her recent experiences of handling large data sets and wrote up a blog post about how to use Regex formulae in Google Sheets.

From the US, SEO Analyst Shannon Dunn suggests an easy approach to optimizing website internal linking structures.

Outside the SEO world, our People Operations Executive Kayla Walker shares her thoughts on how to give better positive feedback.

Distilled’s CEO, Will Critchlow, tried to address the industry’s confusion on the differences between URL structures and Information Architecture.

Last but not least, SearchLove London is approaching! Get your tickets here and do not miss out on the newest and hottest topics in digital marketing. At the time of writing, only 1 VIP ticket is left and you only have two weeks to take advantage of our early bird pricing (£150 off!) which is only available until the 19th of September.

Need further convincing? Will has written up 8 reasons why SearchLove London is worth attending.

Do 404s Hurt SEO and Rankings?

Posted by on Sep 3, 2018 in SEO Articles | Comments Off on Do 404s Hurt SEO and Rankings?

Do 404s Hurt SEO and Rankings?

Status code 404 is probably the most common HTTP error that people encounter when they’re browsing the web. If you’ve been using the internet for over a year, chances that you haven’t encountered one yet are pretty low. They’re very common.

Normally, people don’t pay too much attention to them. As a user, you will get frustrated at most and hit the back button or close the tab. As a webmaster, however, more things might be at stake. Many website owners ask themselves if 404 pages hurt their SEO and rankings in any way.

What Is Error Code 404?

How to Add/Customize a 404 Page
How to Find 404 Errors

Do 404 Pages Hurt SEO?

What Does Google Say About 404s?
Incoming 404s
How to Fix Incoming 404s
Outgoing 404s (Includes Internal 404s)
How to Fix Outgoing 404s

Building Backlinks with the Broken Links Method

Keep reading, as in this article we’ll go over how 404 pages affect your website, SEO and rankings and what you can do to fix things.

What Is Error Code 404?

Error 404 is a standard HTTP status code (also called response code). When you try to access a URL or a server, the server returns a status code to indicate how the operation went. Assuming that most of the web works fine, the most common status code is 200. If you’re reading this article now, it means that your browser was able to access our server and the server found the requested resource, so it returned a 200 response code.

When the client can establish a connection with the server but the server can’t find the requested resource, it returns a 404 status code. It basically means that the page or whatever resource was requested cannot be found at that particular address.

To check the response code of a page, you can right-click anywhere on the page in your browser, hit Inspect, and then go to the Network section. If you can’t see the status codes, press the F5 key or refresh the page while the inspector is still open.

Chrome Inspector

You will usually see a bunch of status codes there. That’s because a page loads multiple resources. For example, the requested page’s HTML/PHP file might be found, but some image resources may have been misspelled or deleted. In this case, the page document will return a 200 response code, while the missing image resources will return 404s.

 

A 404 status code in your browser will look something like this:

 

CognitiveSEO’s 404 Page

 

As you can see, we have a document type 404 error code, which means the page doesn’t exist or wasn’t found at that address, followed by two 200 status codes that represent a couple of images that have been found.

 

Another option would be to use a tool like https://httpstatus.io/. You can insert multiple URLs and it will return their HTTP status codes. This will only pull out the main status code of the document, excluding any other resources. You can, however, add a resource URL directly.
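If you prefer a script to a browser or an online tool, you can also check status codes with a few lines of Python’s standard library. This is a generic sketch, not tied to any tool mentioned in this article; it starts a throwaway local server only so the example is self-contained and runnable:

```python
import http.server
import threading
import urllib.error
import urllib.request

# Throwaway local server so the example runs anywhere: it serves files
# from the current directory, so any unknown path returns a real 404.
server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

def status_code(url):
    """Return the HTTP status code of a URL; 4xx/5xx arrive as HTTPError."""
    try:
        return urllib.request.urlopen(url).status
    except urllib.error.HTTPError as err:
        return err.code

print(status_code(base + "/no-such-page"))  # → 404
```

Point status_code() at any live URL and you get the same main-document status code the browser inspector shows.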

 

Response code tool

 

There are other response codes there that you might have heard of. 500, 501 and 503, for example, usually indicate a server error, while 301 and 302 stand for redirects. These, along with 200 and 404, make up the most common status codes on the web.

 

The 301s you see above in the tool and browser inspector are there because I’ve entered the HTTP version instead of the HTTPS version, so a 301 is performed by our server to redirect users to the secure version of our website. I’ve decided to leave them in the screenshots, because they’re a good example of how browsers and status code tools work.

 

It is really important for a page/resource that doesn’t exist to return a 404 status code. If it returns a 200 code, Google might index it.

 

However, to combat this, Google created a “Soft 404” label. Basically, if the page states that the content isn’t found, but the HTTP status code is 200, we have a soft 404. You can find these types of errors in Google’s Search Console (formerly Webmaster Tools), under Crawl Errors. If you’re already on the new version of Search Console, the easiest way is to temporarily switch to the old one.

 

 

Soft 404s aren’t real error codes. They’re just a label added by Google to signal this issue of a missing page returning a 200 code.
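To make the soft 404 idea concrete, here’s a rough sketch of the kind of heuristic a crawler could use to flag one. To be clear, this is an illustration, not Google’s actual method: the phrase list is my own guess, and the deliberately misconfigured demo server exists only to make the example runnable.

```python
import http.server
import threading
import urllib.error
import urllib.request

class SoftNotFoundHandler(http.server.BaseHTTPRequestHandler):
    """Deliberately broken demo server: it answers every path with a 200
    status code even though the body says the page is missing (the
    classic soft 404 setup)."""

    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"Sorry, page not found!")

    def log_message(self, *args):  # silence request logging for the demo
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), SoftNotFoundHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/deleted-page"

# Hypothetical phrase list; a real crawler's heuristic is far more subtle.
NOT_FOUND_PHRASES = ("page not found", "doesn't exist", "error 404")

def looks_like_soft_404(url):
    """Flag a URL that returns 200 but whose body claims the page is missing."""
    try:
        body = urllib.request.urlopen(url).read().decode(errors="ignore")
    except urllib.error.HTTPError:
        return False  # a real error status code is not a *soft* 404
    return any(phrase in body.lower() for phrase in NOT_FOUND_PHRASES)

print(looks_like_soft_404(url))  # → True
```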

 
How to Add/Customize a 404 Page

 

Normally, your web server should already handle 404s properly. This means that if you try to access a URL that doesn’t exist, the server will already pull out a 404.

However, sometimes the platform might not return a 404, but a blank 200 page. Also, as you can see above, the design isn’t very pleasant and the only option given is to refresh the page… which doesn’t exist. That will keep pulling up a 404 code.

 

It’s a very good idea to have a custom web design for your 404 page. Why? Well, because it can create a better experience for your users. I mean, the experience of not finding what you’re looking for is already bad. But you can add some humor to it, at least.

 

The most important element of your 404 page is a CTA (call to action).

 

Without a call to action, users will most probably leave when they see a regular 404. By inserting some links to some relevant pages, you can hopefully harvest some more traffic to your main pages.

 

Take a look at our example of a 404 page. Big difference, isn’t it? It might actually convince you not to be upset with us. Also, we have a pretty clear CTA that tells you to click on it. It links to the homepage. Our homepage is our hub, from which you can access the most important and relevant parts of our website.

 

cognitiveSEO’s 404 Page Design

 

However, you don’t have to limit yourself to this. You can add links to relevant category pages or other sections of your site. A search bar would also be a great idea.

 

Be creative with your 404’s web design. If it puts a smile on the users’ faces, it might even be better than if they landed on the right page. You can take a look at a few examples in this article, to get your gears spinning.

 

If you have a cool 404 design, share your website with us in the comments section and let’s have a look at it!

 

Most popular CMSs (Content Management Systems), like WordPress or Joomla, already have some sort of 404 design implemented. You can easily add a custom design using a plugin. Here’s a plugin for WordPress.

 

If you have a custom-built website, then you’ll have to create a 404 template. Log into your Apache web server and create a 404.php file. If you already have one, just edit that. Sometimes, it might have the .html extension. If it doesn’t return a 404 status code, change it to .php, because we’ll need to force the HTTP response header with the proper 404 error code using some PHP.

 

<?php
header("HTTP/1.0 404 Not Found");
?>

 

Then, find your .htaccess file and add the following line to it:

 

ErrorDocument 404 /404.php

 

This will tell the server which page should be shown when a 404 error code is detected. If the line is already there, just modify that. That’s it. Make sure you check everything again with your browser’s inspector or with the tool mentioned above. If it returns a 404 code, you’re good to go!

 
How to Find 404 Errors

 

An easy way to find 404 errors is to log into Google’s Search Console (formerly Webmaster Tools). Those are the 404s that Google sees, so they’re definitely the most important ones.

 

If you see Soft 404 errors, as mentioned above in the article, you have to make sure your 404 page actually returns a 404 error code. If not, it’s a good idea to fix that.

 

There are other ways to find 404 errors. If you’re looking for broken pages on your website, which other people have linked to, you can use the cognitiveSEO Site Explorer and check the Broken Pages section.

 

Screenshot from the CognitiveSEO Tool. More details about it below, in the article.

 

If you’re looking to find broken links within your own site, or links to other websites from your website, you can use Screaming Frog. A free alternative would be Xenu Link Sleuth.

 

I’ll show you how to use these SEO tools in detail below.

 
Do 404 Pages Hurt SEO?

 

There are a lot of experts out there stating that 404s will ruin your rankings and that you should fix them as soon as possible. But, the truth is that 404s are a normal part of the web and they are actually useful.

 

Think about it. If a specific place didn’t exist, wouldn’t you rather know that than constantly be directed to other random places? It’s the same on the web. While it’s a good idea to redirect an old page that’s been deleted to a new, relevant page, it’s not such a good idea to redirect every 404 to your homepage, for example. However, I’ve seen some sites redirect their users after a countdown timer, which I thought was a good idea.

 

In theory, 404s have an impact on rankings. But not the rankings of a whole site. If a page returns a 404 error code, it means it doesn’t exist, so Google and other search engines will not index it. Pretty simple, right? What can I say… make sure your pages exist if you want them to rank (ba dum ts).

 

So what’s all the hype about 404s? Well, obviously, having thousands and thousands of 404 pages can impact your website overall.

 

However, it’s not so much the actual 404 pages that hurt SEO, but the links that contain URLs pointing to the 404s.

 

You see, these links create a bad experience. They’re called broken links. If there were no broken links, there wouldn’t even be any 404 errors. In fact, you could say that there are an infinity of 404s, right? Just add a slash after your domain, type something random and hit enter. 404. But if search engines can’t find any links pointing to 404s, the 404s are… double non-existent. Because they already don’t exist… And then they don’t exist again. I hope you get the point.

 

I’ll explain everything in more detail soon, so keep reading.

 
What Does Google Say About 404s?

 

Google has always pointed out that 404s are normal. They also seem to be pretty forgiving with them. I mean, that’s natural, considering that they have 404s of their own:

 

 

In fact, they’ve pointed these things out in an article from 2011 and also in this more recently posted video:

 

 

There’s also this source that treats the issue:

 

 

If you want to read more on this, visit this link, then scroll to the bottom and open the Common URL errors dropdown.

 

However, let’s explain everything in more detail. People often forget that there are two types of 404 pages: the ones on your site and the ones on other people’s websites. They can both affect your site, but the ones that affect you most are the ones on other people’s websites.

 

“What? Other websites’ 404s can impact my website?”

 

Yes, that’s right. If your website links to other websites that return a 404, it can negatively impact its rankings. Remember, it’s not so much the 404s that cause the trouble, but the links to the 404s. No links to 404s, no 404s. So you’d better not create links to 404s.

 
Incoming 404s

 

Incoming 404s are URLs from other websites that point to your website, but return a 404. Incoming 404s are not always easy to fix. That’s because you can’t change the URLs on other websites, if you don’t own them. However, there are workarounds, such as 301 redirects. That should be kept as a last option, in case you cannot fix the URL.

 

These don’t really affect you negatively. I mean, why should you be punished? Maybe someone misspelled it, or maybe you deleted the page because it’s no longer useful. Should you be punished for that? Common sense kind of says that you shouldn’t and Google agrees.

 

However, this does affect your traffic, as when someone links to you, it sends you visitors. This might lead to bad user experience on your side as well. You can’t always change the actions of others, but you can adapt to them and you can definitely control yours.

 

Most webmasters will be glad to fix a 404, because they know it hurts their website. By sending their users to a location that doesn’t exist, they’re creating a bad experience.

 

If you’ve deleted a page with backlinks pointing to it (although it’s not a good idea to delete such a page) you must make sure you have a 301 redirect set up. If not, all the link equity from the backlinks will be lost.

 

If you don’t redirect backlinks to broken pages on your website to relevant locations, you won’t be penalized or anything, but you will miss out on the link equity.

 

A 301 is mandatory, because often you won’t be able to change all the backlinks. Let’s take social media, for example. On a social media platform like Facebook, one post with a broken link could be shared thousands of times. Good luck fixing all of them!

 

You could also link to a 404 on your own website, from your own website. Broken internal linking is common on big websites with thousands of pages, or shops with dynamic URLs and filters. Maybe you’ve removed a product, but someone linked to it in a comment on your blog. Maybe you had a static menu somewhere with some dynamic filters that don’t exist anymore. The possibilities are endless.

 
How to Fix Incoming 404s

 

Fixing incoming 404 URLs isn’t always very easy. That’s because you’re not in full control. If someone misspells a link pointing to your website, you’ll have to convince them to fix it. A good alternative to this is to redirect that broken link to the right resource. However, some equity can be lost in the process, so it’s great if you can get them to change the link. Nevertheless, the 301 is mandatory, just to make sure.

 

If you’ve deleted a page, you can let the webmasters who link to it know. Keep in mind that they might not like this and decide to link to another resource. That’s why you have to make sure that the new resource is their best option.

 

To find incoming broken links, you can use cognitiveSEO’s Site Explorer. Type in your website, hit enter, then go to the Broken Pages tab.

 

 

If you click the blue line, you can see what links are pointing to your 404 URL. The green line represents the number of total domains pointing to it. Some domains might link to your broken page multiple times. For example, the second row shows 33 links coming from 12 domains. The green bar is bigger because the ratio is represented vertically (the third green bar is 4 times smaller than the second green bar).

 

Then, unfortunately, the best method is to contact the owners of the domains and politely point out that there has been a mistake. Show them the correct/new resource and let them know about the possibility of creating a bad experience for their users when linking to a broken page. Most of them should be happy to comply.

 

Whether you get them to link to the right page or not, it’s a good idea to redirect the broken page to a relevant location. I repeat, a relevant location. Don’t randomly redirect pages or bulk redirect them to your homepage.

 

It’s also a good idea to do a background check on the domains before redirecting your URLs. Some of them might be spam and you might want to add them to the disavow list.

 

Remember, 404s should generally stay 404. We only redirect them when they get traffic or have backlinks pointing to them. If you change a URL or delete a page and nobody links to it or it gets absolutely no traffic (check with Google Analytics), it’s perfectly fine for it to return a 404.
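On an Apache server, this kind of single-URL redirect is one line in the same .htaccess file used earlier for the ErrorDocument rule. Both paths below are placeholders; point the old URL at its most relevant live replacement:

```apacheconf
# 301 (permanent) redirect from a deleted URL to its closest live
# equivalent; never bulk-redirect everything to the homepage.
Redirect 301 /old-deleted-page/ /new-relevant-page/
```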

 
Outgoing 404s (Includes Internal 404s)

 

Outgoing 404s are a lot easier to fix because you have complete control over them. That’s because they’re found on your own website. You’re the one linking to them. Sure, someone might have screwed you over by deleting a page or changing its URL, but you’re still responsible for the quality of your own website.

 

The only type of 404 links that really hurt your website are the ones that are on it. When you add a link from your website to another website, you have to make sure that the URL actually exists and that you don’t misspell it. You might also have internal links that are broken. It’s similar to shooting yourself in the foot.

 

Broken links create bad user experience and we all know that Google (and probably other search engines as well) cares about user experience.

 

Google crawls the web by following links from one site to another, so if you tell Google “Hey man, check out this link!” only for it to find a dead end, I’m pretty sure who Google’s going to be mad at.

 

That’s why, from time to time, it’s a good idea to check if you’re not linking out to 404s. You never know when one shows up. The best way to do it is to use some software that crawls your website. 

 
How to Fix Outgoing 404s

 

Fixing outgoing 404s is easier because you have full control over them. They’re on your site, so you can change them.

 

To find them, you can use either Screaming Frog or Xenu Link Sleuth. I know Xenu looks shady, but it’s safe, it works and it’s free.

 

If you have a Screaming Frog subscription, go ahead and crawl your website. The free version supports up to 500 URLs, but a new website with under 500 URLs rarely has broken links. After the crawl is finished (it might take hours or even days for big sites), check the Response Codes tab and filter it by searching for 404. At the bottom, go to the Inlinks section to find the location of the broken URL on your website.

 

 

 

Another way to do it is to go to the External tab, but there you won’t find the internal broken links. To find their locations, go to Inlinks again.

 

 

If you want to use a free alternative, go for Xenu. However, things are a little more complicated with Xenu: it doesn’t point out much beyond URLs and their status codes. It also doesn’t always go through 301s to crawl your entire site, so you’ll have to specify the correct version of your site, be it HTTP or HTTPS, www or non-www.

 

To begin the crawl, go to File -> Check URL. Then enter your website’s correct main address and hit OK. Make sure that the Check External Links box is checked.

 

 

After the crawl is done, you can sort the list by status codes. However, a better way is to go to View and select Show Broken Links Only. After that, to view the location of the broken link on your site, you’ll have to right click and hit URL properties. You’ll find all the pages that link to it.

 

Unfortunately, I haven’t found a proper way of exporting the link locations, so you’re stuck with right clicking each link manually.

 

After you’ve located the links with either Xenu or Screaming Frog, edit them in your admin section to point them to a working URL. You can also just 301 the broken targets, but some link equity will be lost, so the best thing to do is to fix the links themselves. Just remember that, for broken pages on your own site that still have incoming links, the 301 redirect remains mandatory.

 
Building Backlinks with the Broken Links Method

 

These 404s, always a struggle, aren’t they? That’s true, but there’s also a very cool thing about 404s: you can exploit them to build new links.

 

Sounds good, right? Let me explain.

 

Wouldn’t you like someone to point out a broken link on your site? I’d certainly like that. What if they then went even further and gave you a new resource to link to, one even better than the one you were linking to before? Would you consider linking to it?

 

Well, if you find some relevant sites that link to broken pages, you might as well do them a favor and let them know. And how can you do that, exactly? Well, you can use the Broken Pages section of CognitiveSEO’s Site Explorer, of course.

 

 

However, you’ll also need some great content to pitch them if you want this to work. If you don’t have that, they won’t bother linking to you. They’ll just remove the broken link and thank you for pointing it out. So, if you aren’t already working on a great content creation strategy, you should get started.

 

The secret to broken link building, however, is to have awesome content that they can link to.

 

Once you find a website linking to a broken page, all you have to do is email them something like this:

 

Hey there, I was checking your site and followed a link but it leads to a page that doesn’t exist. You might want to fix that, as it creates a bad experience for your users. Also, if you find it appropriate, I have a pretty good resource on that topic you could link to. Let me know if you like it.

 

I’d go one step further and actually search the site that has been linked to for the resource. If it’s there, at a new location, point that out before pitching your article. You’ll have more chances of them trusting you this way. Your article will be an alternative. Also, if the old resource is worse, they’ll be able to compare the two and see the difference.

 

The broken link method is one of the best SEO strategies for link building. If you want to learn more about this method and how to apply it effectively, you can read this awesome article about broken pages link building technique.

 

Conclusion

 

So, if you were wondering if 404 errors hurt SEO, now you know the answer. Anyway, let me summarize it:

 

404 error pages don’t really hurt your SEO, but there’s definitely a lot you can miss out on if you don’t fix them. If you have backlinks pointing to pages on your website that return a 404, try to fix those backlinks and 301 redirect your broken URLs to relevant locations. If you have links on your site that point to broken pages, make sure you fix those as soon as possible, to maximize the link equity flow and UX.

 

What are your experiences with 404 pages? Do you constantly check your website for 404s? Have you ever used the broken pages link building method mentioned above? Let us know in the comments section!

The post Do 404s Hurt SEO and Rankings? appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

4 warning signs AdSense is ruining your contextual advertising strategy

Posted by on Aug 30, 2018 in SEO Articles | Comments Off on 4 warning signs AdSense is ruining your contextual advertising strategy

4 warning signs AdSense is ruining your contextual advertising strategy

In the dark ages of the SEO era, when bloggers and webmasters were still familiarizing themselves with the process and its functionality, certain tactics and strategies had become industry standards.

The era I’m talking about is the one where Google AdSense was heavily built into the foundation of a blogger’s strategy. The “legacy” tactics associated with this approach can still be found in the way modern publishers think about SEO and branding strategy. However, AdSense’s limited customizability can hold publishers back. This needs to be addressed and rooted out.

Before assuming AdSense is the best monetization partner for you, consider these four warning signs. If you’re guilty of practicing any of these points, it’s time you re-evaluated your monetization partner and strategy.

1. You haven’t considered other platforms

It’s no secret that AdSense as a standalone monetization stream isn’t enough to earn substantial revenue. Most solopreneurs that still operate in the “blogosphere” have understood for years that it is important to branch out and diversify revenue streams. So there’s nothing revolutionary about this concept.

Most of the focus on diversification has been on developing products to sell, with eBooks being a gold standard. This is great advice, even if it can become a bit boilerplate at times. But we’re not talking about selling products today. We’re talking about contextual advertising, which means placing relevant ads on your site that fit in with the content of your page. When it comes to contextual advertising, too many people still aren’t considering their other options.

Media.net, the second largest contextual advertising business worldwide by revenue, is a good place to start experimenting. The platform uses machine-learning algorithms to predict user intent, based on the content of your pages, and serves ads based on the associated keywords. With Media.net you get exclusive access to Yahoo! Bing’s $6 billion worth of search demand. This allows you to leverage quality advertisers even if you are in a smaller niche.

Performance is obviously different for every site, but Perrin Carrell of AuthorityHacker claims Media.net ads earn them 5x as much as AdSense ads, and Jon Dykstra of FatStacksBlog reported that some Media.net ad placements were earning more revenue than all other ad networks.

One of the biggest advantages of Media.net is that its ads are heavily customizable. Sizes and designs can be tailored to match your site so that they are native to your content and in line with your branding, resulting in higher engagement and revenue. Native ads are a great way to offer your readers an uninterrupted experience, since these ads look like a natural extension of your website. In fact, these ads are also mobile responsive, which means more revenue for you.

Media.net Native Ad Unit

 

Media.net Contextual Ad Unit

From there, you can also consider ad servers like the Google Ad Manager (formerly DoubleClick For Publishers) and OpenX. Ad server platforms like these give publishers great control over ads, including the ability to set preferred deals with a CPM floor, and the option to interact directly with the ad marketplace.

In short, if AdSense is the only ad platform you’ve experimented with, you are missing out on great revenue-generating opportunities.

2. You are picking topics based on AdWords keyword bids

The SEO industry grew up on the Google AdWords Keyword Tool, and its successor, the Keyword Planner. One trend, born in the age of “Made For AdSense” (MFA) blogs and microsites, was to use the Keyword Planner to discover topics to write about based on AdWords bid prices.

This approach was never a good long-term strategy. A blog based on topics chosen to optimize revenue according to this approach often leads to a poorly branded content site that doesn’t naturally adapt to the needs of its audience. The obviously commercial intent of the topics chosen puts a hard ceiling on the size of your recurring audience.

Search engines like sites that build recurring audiences. Such sites earn higher click-through rates in the search results, which Googlers have admitted are used to judge SERP quality.

Modern content creators need to select topics based on what will most successfully help them cultivate an audience. This means identifying keywords that address specific problems you can help users solve. 

You do not find these topics by typing generic industry keywords into the Keyword Planner. You find them by identifying your audience and the platforms they frequent, the kind of questions they ask one another, or even asking them directly what they are most frustrated with, and looking for satisfaction gaps in the answers to those questions. Only then should you turn to the Keyword Planner to start looking for the best keywords to represent your solutions.

The goal isn’t to target valuable keywords, but to target valuable audiences. This is a crucial difference that should guide your strategy at a more foundational level.

3. Your ad placement is based on MFA “best practices” instead of testing

“Best practices” rooted in old school MFA thinking prevent you from building your own monetization strategy from the ground up. They can also hurt your rankings in the search results.

Damaged Rankings

Old school, “gray hat” MFA tactics, like placing ads where they will be confused for navigation instead of fitting them to your layout and content, were never good branding strategies, and they simply don’t work anymore.

Google’s internal quality rater guidelines explicitly state that sites should never disguise advertisements as the main content or navigation of the site, and if they do they will receive the “lowest” quality rating. Likewise for ads that make the main content unreadable, as well as ads that are distracting because they are too shocking.

Bad Strategy

Even advice that seems innocuous and doesn’t violate search guidelines can be harmful.

Recommendations like “place your ad in the sidebar,” “place it within your content just above the fold,” or “use the 300×250 ad size” are often unhelpful and counterproductive. Advice this specific shouldn’t be given without context, because ads should be placed in a way that fits your site design.

Suggestions like these are always hypotheses that you should test, not rules written in stone. Run your own A/B tests to find out what works for you.

We recommend Google Analytics Experiments for your testing because their Bayesian statistical methods make it easier to interpret results, because they are free, and because the data is as fully incorporated into Google Analytics as possible.

4. You are not partnering with sponsors

This is one of the biggest opportunities you miss out on if you operate on an AdSense-focused monetization strategy. When you work with sponsors, you can work advertisements entirely into the main content of your blog post, or host articles that are sponsored content created by sponsors themselves. You can negotiate deals that will guarantee a certain level of revenue, which is not always possible using programmatic advertising.

You can collaborate with sponsors on innovative campaigns that will earn the sponsor far more attention than traditional ads, which naturally means they will be willing to spend more. Innovative approaches can also result in more exposure not just for your sponsor, but even for your own brand.

It also lets you monetize on channels where AdSense won’t, such as your social media platforms.

If you aren’t reaching out to potential sponsors to discuss possibilities like these, you are missing out on substantial revenue.

Conclusion

AdSense should not be thought of as central to your contextual advertising strategy, or worse, the foundation of how you approach brand building. Diversify your advertising platforms, migrate your market research outside of AdSense’s native tools, and rely on your own testing strategies. Let your brand drive your monetization strategy, not the other way around.

Manish Dudharejia is the president and founder of E2M Solutions Inc, a San Diego based digital agency that specializes in website design & development and ecommerce SEO. Follow him on Twitter.

Structured Data Can = MehSEO

Posted by on Aug 30, 2018 in SEO Articles | Comments Off on Structured Data Can = MehSEO

Structured Data Can = MehSEO

In 2011, Google, Bing & Yahoo announced Schema.org, which got SEOs all excited to start marking up website content to turn it into “structured data.” The benefit would be that search engines would be more certain that a text string of numbers was in fact a phone number, or at least more certain that you wanted them to think it was a phone number. The search engines could then turn the structured data into eye-catching fripperies designed to seduce searchers into surrendering their clicks and revenue to your fabulously marked-up site (aka “Rich Results”).

It also could help your fridge talk to your Tesla.

So pretty much every SEO marked up their audits and conference presentations with recommendations to mark up all the things. LSG was no exception. And we have seen it work some nice SEO miracles.

There was the ecommerce site that lost all its product review stars until we reconfigured the markup. There was the yellow pages site that got a spammy structured data manual action for merging a partner’s review feed into its own. There is the software vendor and its clients that (still!) violate Google’s structured data guidelines and get away with it. There have been countless Knowledge Panels that have needed the tweaking one can only get from a perfectly implemented https://schema.org/logo.
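For reference, the https://schema.org/logo markup mentioned above is only a few lines of JSON-LD in a page’s head. The URLs below are placeholders, not a real implementation:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/images/logo.png"
}
</script>
```

Google reads this to associate a logo with the organization in features like Knowledge Panels; you can validate a snippet like this with Google’s structured data testing tools before shipping it.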

But structured data is not a killer SEO strategy for all situations, and it’s important that SEOs and clients understand that often it’s more of a future-proofing game than an actual near-term traffic or money-generator. For example, let’s take this UGC site that generated about 22 million clicks from Google over the past three months and see how many clicks are reported as coming from “Rich Results” in Google Search Console:

So less than one-half of one-half of 1% of clicks came from a “Rich Result.” Not particularly impressive.

The good news is that Google is in fact using the structured markup. We can see evidence of it in the SERPs. But it’s likely the content of this site doesn’t lend itself to eye-popping featured snippets. For example, many of the Rich Results appear to just be bolded words that appear in the URL snippets in the SERPs, kind of like this:

It also may just take time before Google trusts your markup.

So before you drop everything and prioritize structured markup, you may want to consult Google’s Structured Data Gallery to get an idea of which types of content Google is pushing to markup. You also should check the SERPs to see what your competitors are doing in this area and how their marked-up content is being displayed. This should give you a good idea of what the potential is for your site.

And remember, “you can mark up anything, but you can’t mark up everything…” – Tony Robbins?

The post Structured Data Can = MehSEO appeared first on Local SEO Guide.

Getting personal with SEO: how to use search behavior to transform your campaign

Posted by on Aug 29, 2018 in SEO Articles | Comments Off on Getting personal with SEO: how to use search behavior to transform your campaign

In order to meet the needs of today’s consumers and a more intelligent digital market, creating value in optimization campaigns requires innovative thinking and a personalized approach. Adverts, landing pages, and on-site messages that feel tailor-made are becoming the norm for many brands, contributing to higher response rates, visibility, and value.

Arguably, in today’s post-truth era, creating a personal message that can tap into the emotions and needs of a consumer is exactly the direction in which we will continue to progress. It’s also likely that in the near future, this will become the only way that optimization campaigns can be successful.

Anyone can enhance and deliver stronger campaigns by picking insights from search behaviors and using them to directly address your digital customers. But how can you maximize the effectiveness of doing this? Using Delete’s European Search Award-winning campaign for Leeds Beckett University as a case study, this article will take an in-depth look into profiling and understanding your browsers to attract and convert new customers.

Why utilizing user search behavior is necessary in campaigns

From Google’s personalized search algorithm that was launched in 2005, to 2015’s RankBrain, search results have consistently shifted towards searcher satisfaction rather than the needs of a webmaster or business. As users began to demand more intelligent, considered content (keyword stuffing is now a definitive no-go), we’ve had to adapt by creating engaging content that is authoritative in terms of knowledge and information.

There are clear signs that behavior signals are on Google’s radar. Google now elevates the results that it considers to be more relevant to a searcher based on profile information that it gathers about them. So, when it comes to creating your own outreach campaigns, it is only logical to harness and use this profile information to influence post-click user experience.

Harness search behavior to create customer profiles and develop positive relationships

Using search behavior information and user profiles is important because of the phenomenal results you can achieve, particularly at a time when advertising is becoming more challenging by the day.

Splitting users into customer profiles is a method that will enable the creation of targeted, tailor-made advertising and content that is more likely to result in conversions. There are a variety of ways that user behavior can be tracked and profiled, varying from more in-depth and specific methods to quicker, cheaper options that may benefit a brand looking to boost a current campaign or alter the way that their advertising is completed in-house. Not only will customer profiles ensure that only relevant content is delivered to users, but they can also contribute to the development of customer trust and loyalty.

Delete’s Leeds Beckett campaign involved developing tailor-made landing pages and adverts for international students, aiming to encourage verbal contact with the university as early in the cycle as possible and to make the application process easier and less daunting. By using geographical data, we were able to create customer profiles for international students, which meant we could serve carefully selected imagery to visitors from China, India, and Europe, as well as clear and relevant calls to action.

Splitting apart potential customers by geography, interests, and type of content consumption on the site is the most efficient way to create customer profiles. It can be done through both organic searches and paid searches, with both outlets leading to different customer bases across a variety of platforms. Leveraging existing data is also a practical and simple solution that will help develop stronger relationships with a current customer base. You can then lead users to dynamic pages and imagery that are reflective of organic searches, geolocation, and paid advertising clicks.
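A minimal sketch of what geographic profiling like this might look like in code: bucket visitors into coarse profiles by country and pick region-specific imagery. The country lists, profile names, and asset names are all assumptions for illustration, not Delete's actual implementation.

```python
# Map individual countries to dedicated profiles; everything else in a
# defined region falls back to a regional profile, then to a default.
PROFILE_BY_COUNTRY = {
    "CN": "china-students",
    "IN": "india-students",
}
EU_COUNTRIES = {"FR", "DE", "ES", "IT", "NL", "PL"}

def profile_for(country_code: str) -> str:
    """Return the customer profile for a visitor's country code."""
    if country_code in PROFILE_BY_COUNTRY:
        return PROFILE_BY_COUNTRY[country_code]
    if country_code in EU_COUNTRIES:
        return "europe-students"
    return "international-default"

print(profile_for("CN"))  # -> china-students
print(profile_for("DE"))  # -> europe-students
```

In practice the country code would come from IP geolocation or ad-platform targeting data rather than being passed in directly.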

The value in creating customer profiles from paid or organic searches

Advertisers now have to look for ways to outsmart the competition. Unfortunately, managing a campaign well is no longer anything special, but a default expectation. Try going beyond the boundaries of just “best practice” SEO or PPC and show real innovation and creativity; it will really pay off.

Using data from users’ organic searches enables a valuable customer profile of people who are already invested or interested in a brand. When it comes to applying this behavior to SEO, it results in the opportunity to tap into a receptive audience who will benefit from additional information and who may have abandoned conversion if they hadn’t been given access to the information that they were looking for.

Delete’s campaign with Leeds Beckett University experienced phenomenal results. On a typical budget for a campaign of its caliber, we were able to generate approximately £6.9 million in revenue in one year and an ROI of 10,403%. The use of customer profiles undoubtedly played a large part in this.

Use geographical data to deliver direct and relevant information

In an aim to target potential customers and increase conversion, Delete used an innovative method of developing a live map that would plot the addresses of past enrollments, prospects gathered at educational fairs, and open day registrations. This completely changed their geographical targeting in all marketing campaigns, resulting in a 691.67% increase in traffic to the clearing section.

By creating customer profiles based on geography, there is the opportunity to attract and cater to people who may have less initial interest as well as reduce abandoned conversions due to unrelated content. As well as this, it can encourage behaviors that are natural and reflective of the user with a lower cost per click and a higher volume of leads.

Revolutionize the way you use paid and organic search behavior for remarkable results

To maximize results in a marketing campaign, create dynamic landing pages and a website experience based on recorded search behaviors and the profiles that can subsequently be built from this information. When it comes to paid ads, you can pass targeting and settings through to a website and use this information to personalize it.

With organic listings, you can glean user interests from the entrance pages they arrive at from organic search and from what they do once they are on a page. If you build your landing pages well, so that they target the desired keywords, you can also make assumptions about people landing on these pages from organic search and then interact with them accordingly, even targeting specific interests.

For example, in our campaign with Leeds Beckett, if a user indicated an interest in a Civil Engineering degree (by clicking on a PPC ad from the Civil Engineering for Undergraduates ad group), the landing page or the whole website would start surfacing an image of a work placement student standing on a building site, wearing a hard hat and high-visibility jacket. This brings the individual student’s interests to the surface, highlighting the most relevant features the university has on offer. Ultimately, the aim here is to shorten the user journey and increase the chance of a conversion.
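The mechanism behind this kind of personalization can be sketched simply: read the ad group out of a tracking parameter on the landing URL and choose a matching hero image. The parameter name (`adgroup`), ad-group slugs, and image filenames below are hypothetical, not Delete's real setup.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical mapping from PPC ad-group slug to course-specific imagery.
HERO_IMAGES = {
    "civil-engineering": "students-on-building-site.jpg",
    "nursing": "placement-ward.jpg",
}
DEFAULT_HERO = "campus-generic.jpg"

def hero_for_landing(url: str) -> str:
    """Pick a hero image based on the ad group in the landing URL's query string."""
    params = parse_qs(urlparse(url).query)
    adgroup = params.get("adgroup", [""])[0]
    return HERO_IMAGES.get(adgroup, DEFAULT_HERO)

print(hero_for_landing("https://example.edu/courses?adgroup=civil-engineering"))
# -> students-on-building-site.jpg
```

A real implementation would also persist the inferred interest (e.g. in a cookie) so the rest of the site can keep surfacing relevant imagery, as the campaign described above did.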

This can be applied to almost any marketing area or industry, and it will transform the way that your users are able to engage with your content.

The Long-Term Link Acquisition Value of Content Marketing

Posted by on Aug 29, 2018 in SEO Articles | Comments Off on The Long-Term Link Acquisition Value of Content Marketing

Posted by KristinTynski

Recently, new internal analysis of our work here at Fractl has yielded a fascinating finding:

Content marketing that generates mainstream press is likely 2X as effective as originally thought. Additionally, the long-term ROI is potentially many times higher than previously reported.

I’ll caveat that by saying this applies only to content that can generate mainstream press attention. At Fractl, this is our primary focus as a content marketing agency. Our team, our process, and our research are all structured around figuring out ways to maximize the newsworthiness and promotional success of the content we create on behalf of our clients.

Though data-driven content marketing paired with digital PR is on the rise, there is still a general lack of understanding around the long-term value of any individual content execution. In this exploration, we sought to answer the question: What link value does a successful campaign drive over the long term? What we found was surprising and strongly reiterated our conviction that this style of data-driven content and digital PR yields some of the highest possible ROI for link building and SEO.

To better understand this full value, we wanted to look at the long-term accumulation of the two types of links on which we report:

Direct links from publishers to our client’s content on their domain
Secondary links that link to the story the publisher wrote about our client’s content

While direct links are most important, secondary links often provide significant additional pass-through authority and can often be reclaimed through additional outreach and converted into direct do-follow links (something we have a team dedicated to doing at Fractl).

Below is a visualization of the way our content promotion process works:

So how exactly do direct links and secondary links accumulate over time?

To understand this, we did a full audit of four successful campaigns from 2015 and 2016 through today. Having a few years of aggregation gave us an initial benchmark for how links accumulate over time for general interest content that is relatively evergreen.

We profiled four campaigns:

Perceptions of Perfection Across Borders
America’s Most P.C. and Prejudiced Places
Reverse-Photoshopping Video Game Characters
Water Bottle Germs Revealed

The first view we looked at was direct links, or links pointing directly to the client blog posts hosting the content we’ve created on their behalf.

There is a good deal of variability between campaigns, but we see a few interesting general trends that show up in all of the examples in the rest of this article:

Both direct and secondary links will accumulate in a few predictable ways:
A large initial spike with a smooth decline
A buildup to a large spike with a smooth decline
Multiple spikes of varying size

Roughly 50% of the total volume of links that will be built will accumulate in the first 30 days. The other 50% will accumulate over the following two years and beyond.
A small subset of direct links will generate their own large spikes of secondary links.

We’ll now take a look at some specific results. Let’s start by looking at direct links (pickups that link directly back to our client’s site or landing page).

The typical result: A large initial spike with consistent accumulation over time

This campaign, featuring artistic imaginings of what bodies in video games might look like with normal BMI/body sizes, shows the most typical pattern we witnessed, with a very large initial spike and a relatively smooth decline in link acquisition over the first month.

After the first month, long-term new direct link acquisition continued for more than two years (and is still going today!).

The less common result: Slow draw up to a major spike

In this example, you can see that sometimes it takes a few days or even weeks to see the initial pickup spike and subsequent primary syndication. In the case of this campaign, we saw a slow buildup to the peak about a week after the first pickup (an exclusive), with a gradual decline over the following two weeks.

Zooming out to a month-over-month view, we can see resurgences in pickups happening at unpredictable intervals every few months or so. These spikes continued up until today with relative consistency. This happened as some of the stories written during the initial spike began to rank well in Google. These initial stories were then used as fodder or inspiration for stories written months later by other publications. For evergreen topics such as body image (as was the case in this campaign), you will also see writers and editors cycle in and out of writing about these topics as they trend in the public zeitgeist, leading to these unpredictable yet very welcomed resurgences in new links.

Least common result: Multiple spikes in the first few weeks

The third pattern we observed was seen on a campaign we executed examining hate speech on Twitter. In this case, we saw multiple spikes during this early period, corresponding to syndications on other mainstream publications that then sparked their own downstream syndications and individual virality.

Zooming out, we saw a similar result to the other examples, with multiple smaller spikes, more frequent within the first year and less frequent in the following two years. Each of these bumps is associated with the story resurfacing organically on new publications (usually a writer stumbling on coverage of the content from its initial phase of popularity).

Long-term resurgences

Finally, in our fourth example that looked at germs on water bottles, we saw a fascinating phenomenon happen beyond the first month where there was a very significant secondary spike.

This spike represents syndication across all or most of the iHeartRadio network. As this example demonstrates, it isn’t wholly unusual to see large-scale networks pick up content even a year or more later, with results that rival or even exceed those of the initial month.

Aggregate trends

When we looked at direct links back to all four campaigns together, we saw the common progression of link acquisition over time. The chart below shows the distribution of new links acquired over two years. We saw a pretty classic long tail distribution here, where 50% of the total links acquired happened in the first month, and the other 50% were acquired in the following two to three years.

“If direct links are the cake, secondary links are the icing, and both accumulate substantially over time.”

Links generated directly to the blog posts/landing pages of the content we’ve created on our clients’ behalf are only really a part of the story. When a campaign garners mainstream press attention, the press stories can often go mildly viral, generating large numbers of syndications and links to these stories themselves. We track these secondary links and reach out to the writers of these stories to try and get link attributions to the primary source (our clients’ blog posts or landing pages where the story/study/content lives).

These types of links also follow a similar pattern over time to direct links. Below are the publishing dates of these secondary links as they were found over time. Their distribution follows the same pattern, with 50% of results realized within the first month and the remaining 50% of the value coming over the next two to three years.
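The long-tail pattern described here is easy to verify against monthly link counts. The numbers below are invented to mirror the shape reported (a first-month spike followed by a multi-year tail), not real campaign data.

```python
# Illustrative monthly new-link counts: a month-one spike, then a long tail.
monthly_links = [120] + [5] * 24  # month 1, then 24 tail months

def first_month_share(counts):
    """Fraction of all links acquired in the first month."""
    return counts[0] / sum(counts)

print(f"{first_month_share(monthly_links):.0%}")  # -> 50%
```

With this shape, stopping measurement at day 30 captures only half the links the campaign will ultimately earn.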

The value in the long tail

By looking at multi-year direct and secondary links built to successful content marketing campaigns, it becomes apparent that the total number of links acquired during the first month is really only about half the story.

For campaigns that garner initial mainstream pickups, there is often a multi-year long tail of links that are built organically without any additional or future promotions work beyond the first month. While this long-term value is not something we report on or charge our clients for explicitly, it is extremely important to understand as a part of a larger calculus when trying to decide if doing content marketing with the goal of press acquisition is right for your needs.

Cost-per-link (a typical way to measure the ROI of such campaigns) will halve if links are measured over these longer periods, moving a project you perhaps considered a marginal success at one month to a major success at one year.
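The halving follows directly from the 50/50 split. A sketch of the arithmetic, with invented figures (not Fractl's actual costs or link counts):

```python
# Hypothetical campaign figures for illustrating the cost-per-link point.
campaign_cost = 10_000          # one-off production + promotion cost
links_first_month = 50          # links counted at the 30-day mark
links_after_two_years = 100     # ~50% of total links arrive after month one

cpl_month_one = campaign_cost / links_first_month      # 200.0 per link
cpl_long_term = campaign_cost / links_after_two_years  # 100.0 per link

print(cpl_month_one, cpl_long_term)  # -> 200.0 100.0
```

Since no further spend occurs after the first month, every tail link acquired is pure downside protection on the original investment.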
