Blog

Announcing Full-Funnel Testing – testing SEO and CRO at the same time

Posted on Oct 24, 2018 in SEO Articles


Until now it’s not been possible to measure the impact of SEO and CRO at the same time. Today we’re proud to announce a new feature of Distilled’s Optimisation Delivery Network that we’re calling full funnel testing.

Our ODN platform launched with a focus on SEO testing. You may have thought about it by comparison with tools like Optimizely, which let you run CRO tests. If you want to know how SEO testing works and how it differs from CRO, you can read more in this post on what SEO testing is.

The trouble with just using one or the other is you don’t have any insight into how they impact each other.

That’s a big problem, because we know from our testing that many SEO changes impact conversion rate, and many CRO changes (even when they increase conversion rate) can negatively impact organic traffic. If you haven’t read it already, check out Will’s blog post on the impact of rolling out negative SEO changes, but here’s an example of when it goes wrong. This chart shows the search impact of a suggested CRO change on SEO: it decreased organic traffic by 25%.

For that reason, we see the relationship between SEO and CRO like this: 

We saw a need to be able to measure SEO and CRO at the same time. For the last few months, we’ve been running a beta version for some of our clients of what we are calling “full-funnel testing”. Today we’re opening that feature up to everyone and we’d like to show you how it works.

How does it work?

Let’s look at CRO first. To run a CRO experiment, we cookie users based on the landing page design they arrive on; they’ll then always see that version as they move between pages.

The result is we know the impact on conversion rate, but we don’t know the impact on SEO.
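As a rough illustration of that user-level bucketing (the function name, cookie name, and 50/50 split are our own assumptions, not ODN’s actual implementation), the CRO side can be sketched like this:

```python
import random

COOKIE_NAME = "cro_variant"  # hypothetical cookie name

def assign_variant(cookies, rng=None):
    """Return this user's template, setting a cookie on first visit.

    Once the cookie exists, the user keeps seeing the same template on
    every subsequent page, which is what isolates the conversion-rate
    effect of the design change."""
    if COOKIE_NAME in cookies:
        return cookies[COOKIE_NAME]          # returning visitor: keep their variant
    rng = rng or random.Random()
    variant = "A" if rng.random() < 0.5 else "B"  # new visitor: 50/50 split
    cookies[COOKIE_NAME] = variant           # persist the choice in a cookie
    return variant
```

Here the `cookies` dict stands in for the request/response cookie jar a real web framework would provide.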

When we do pure SEO testing, we split pages, not users and look at the different impacts on search traffic to the control and variant pages:

The result of this framework is that we know the impact on SEO but we don’t know the impact on conversion rate:

A new framework – Full-funnel testing

With full funnel testing, the site is set up initially in the same way as in the pure SEO testing scenario – and then when someone arrives on a landing page, the SEO testing part of the experiment is complete:

We can then pivot into a CRO experiment by dropping a cookie for that user to make sure they see the same template that they first landed on when moving between pages:

Note that, having landed on the Unicorns page initially, they now see the “A” template version on all subsequent pageviews even on pages like Cats and Badgers that would be set up with the “B” template for anyone landing directly on them as a new visitor:

The result is that we are able to measure the impact of changes on SEO and CRO at the same time.
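A minimal sketch of the logic described above (function and cookie names are hypothetical, and the hash-based split is an assumption about how pages might be bucketed): pages are deterministically split for the SEO half, and a cookie then pins each visitor to their landing template for the CRO half.

```python
import hashlib

def page_bucket(url: str) -> str:
    """SEO half: split *pages*, not users. Hashing the URL means every
    crawler visit to a page sees the same bucket, so organic traffic to
    control and variant pages can be compared."""
    digest = hashlib.md5(url.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def template_for(url: str, cookies: dict) -> str:
    """CRO half: once a visitor lands, pin them to their landing
    template with a cookie, even on pages whose own bucket differs."""
    if "ff_variant" not in cookies:
        cookies["ff_variant"] = page_bucket(url)  # landing page decides
    return cookies["ff_variant"]
```

This reproduces the Unicorns/Cats/Badgers behaviour: a visitor landing on an “A” page keeps seeing template A everywhere, while a fresh visitor landing directly on a “B” page sees template B.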

Thanks for making it this far, you can expect to hear more about this as we get more examples of full-funnel tests and start to share what we learn. If you’d like to know more or see a demo, reach out to us here.

Google + Is Shutting Down. How Does It Impact Your SEO?

Posted on Oct 17, 2018 in SEO Articles


Google+? Had you forgotten about it, too? Many of you may have been disregarding Google+ over the past few years, but it was still there. Now, as the latest news confirms, it will be shut down after user information was exposed. For some, that’s no shock; for others, it’s sad news.

 

All in all, the news might affect lots of business owners. We’ve talked on our blog before, on multiple occasions, about the effect of social signals on your website. Now it’s time to see what has actually happened and what’s next.

 

 

All the fuss was sparked by news that Google+ will be shut down after a user data leak. Sources say over 500,000 users are affected: between 2015 and March 2018, a bug exposed users’ private information to third-party app developers. Google did not disclose the security breach it found in March 2018, and that decision came back to haunt the company.

 

The problem is even more serious if we recall the similar situation earlier this year, when Facebook acknowledged that Cambridge Analytica, a British research organization that had performed work for the Trump campaign, had improperly gained access to the personal information of up to 87 million Facebook users.

 

How Google’s Officials Treated the News
Will Shutting Down Google + Affect Your Business?
What’s Next in Your Social Media Strategy

 
1. How Google’s Officials Treated the News

The decision to stay quiet drew the attention of the cybersecurity community, because laws in California and Europe say a company must disclose a security incident.

 

On the other hand, Google stayed quiet because its “Privacy &amp; Data Protection Office” concluded the incident did not meet the legal threshold for reporting. The giant stated in a blog post that it found no evidence that user information had been misused:

 

We found no evidence that any developer was aware of this bug, or abusing the API, and we found no evidence that any Profile data was misused.

Google

 

 

Applications made by other companies had access to Profile fields that were shared with the user, but not marked as public. Google’s officials said that:

 

This data is limited to static, optional Google+ Profile fields including name, email address, occupation, gender and age. It does not include any other data you may have posted or connected to Google+ or any other service, like Google+ posts, messages, Google account data, phone numbers or G Suite content.

Google

 

 

So apps did not have access to phone numbers, messages, Google+ posts, or data from other Google accounts. Google also found no evidence that outside developers had discovered the breach, and the issue was fixed in March.

 

The funny thing in this situation is that Google’s top managers stopped posting on Google+ as far back as 2015. That’s striking, because a Wall Street Journal report showed Google began exposing user data around that same date, as mentioned previously.

 

Larry Page, the co-founder of Google, made his last post on 21 August 2015, as you can see in the next screenshot. It seems he gave up on Google+ a long time ago.

 

 

Since 30 June 2011, Larry had posted just 147 times on his Google+ account. That’s very little! Spread over the roughly four years up to his last post in 2015, it works out to about three posts per month.

 

Let’s take a look at the second co-founder, Sergey Brin’s Google+ page. His last post is more recent: 9 September 2017. Of all Google’s management, he used Google+ the longest. His last post was a photo of the Ragged Islands in the Bahamas, taken just a few hours before they bore the brunt of Hurricane Irma.

 

 

A few years ago, Sergey Brin said that he is not a very social person and hadn’t spent much time on Facebook or Twitter, Google+’s competition. More recently, his Twitter account (@sergeybrinn) was suspended. Some voices say he had a secret personal Facebook page, but we couldn’t find it; there are plenty of fake accounts instead.

 

Although he expressly said he’s not much of a social person, he used Google+ the most of all Google’s officials.

 

Sundar Pichai, the CEO of Google, last posted on 9 March 2016, about Google’s DeepMind challenge. He was the second to give up on Google+, after Larry Page.

 

 

And while his Google+ account was left in ruin, his Twitter profile is flourishing. He has 3.69M followers on Google+ and 2.02M on Twitter, but only around 260 Google+ posts compared with over 1,000 tweets.

 

 

Eric Schmidt, Google’s former Executive Chairman, quit posting on 17 February 2017, well after he moved from Google to Alphabet in the 2015 restructuring.

 

 

On Twitter, by contrast, his last post was on 3 October 2018. Even if he posts infrequently, he’s more active on Twitter than on Google+.

 

It’s sad to see that even Google’s management left the social network in free fall. The bigger shock is that none of Google’s six independent board members have ever posted publicly on Google+, according to Mashable.

 

Source: mashable.com

 

It makes you wonder, because Google+ was supposed to have a real influence on businesses. The fact that it was never refurbished explains how it got to this point. Sadly, the data breach was inevitable in this case… So, the question we have is: “How could the decision to shut down Google+ influence your business, and does it affect your SEO?”

 
2. Will Shutting Down Google + Affect Your Business?

 

You’re probably thinking:

 

If Google’s top representatives don’t use Google+, then why did we?

 

The answer is simple: because of the influence social signals might have.

 

We’ve shown you multiple times that social signals are very important for driving awareness and creating authority for a website, pushing it up the SERPs. In 2016, we analyzed 23 million shares to see their impact on rankings. We discovered that the average number of Google+ shares for pages ranking 1st is significantly higher, so they might have value for pushing pages to higher positions in Google.

 

 

Moreover, higher rankings correlate with high share counts across Facebook, Google+, LinkedIn, and Pinterest together. Now that you’re losing one link in the chain, it might affect your website. But if all websites lose the same link, wouldn’t the loss be equal for everyone? Well, that depends. We saw that micro-content ranking 1st is correlated with high G+ shares.

 

 

Google launched the +1 button for websites in 2011. Some voices said Google used +1s as signals for search quality and rankings, but Google denied the allegation and said it had never used Google+ or +1s as a ranking signal.

 

Google+ started as a promising project, but it died a slow death, with nothing intriguing to offer. There was a time when Google tried to make people use it and drive conversations on the network by highlighting Google+ content in search results and in Google News, showing you the discussions happening on Google+. In the end, it was all for nothing, and the SEO community even made fun of it.

 

After a few attempts to refresh the social network, Google stopped pushing it to users and slowly lost interest in it.  

 

The explanation comes in the Google blog post announcing the closure of the social network, where the company acknowledged that Google+ never caught on despite all the efforts:

 

While our engineering teams have put a lot of effort and dedication into building Google+ over the years, it has not achieved broad consumer or developer adoption, and has seen limited user interaction with apps.

Google

 

Up to this point, we can say that if Google’s algorithms are updated properly so that G+ signals no longer influence rankings, then no website should be affected. That would be the ideal situation. And, as mentioned before, Google+ accounts will disappear for all websites, so the loss would be equal. Looking at this with a critical eye, though, there might be cases where some websites see a slight impact: for example, websites that relied mainly on G+ to promote their business.

 
3. What’s Next in Your Social Media Strategy

 

What is Google planning to do in the near future, now that it has lost Google+? Should we expect new Google products or something similar? What we know is what Google has stated.

 

All users have until August 2019 to save their data, so take whatever you need before then, when the network will be closed for good. In the meantime, Google says it will offer additional information to help users download or migrate their data.

 

Google might create new products or features, but only for businesses, because, as they said, the main focus now is on enterprise products.

 

We’ve decided to focus on our enterprise efforts and will be launching new features purpose-built for businesses. We will share more information in the coming days.  

Google

 

 

We’ll just have to wait and see what happens, but one thing is clear: Google says businesses shouldn’t suffer from this decision, as it will come up with a solution.

The post Google + Is Shutting Down. How Does It Impact Your SEO? appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

Marginal losses: the hidden reason your SEO performance is lagging

Posted on Oct 12, 2018 in SEO Articles


Without a structured testing program, our experience shows that it’s very likely that most SEO efforts are at best taking two steps forward and one step back by routinely deploying changes that make things worse.

This is true even when the thinking behind a change is solid, is based on correct data, and is part of a well-thought-out strategy. The problem is not that all the changes are bad in theory – it’s that many changes come with inevitable trade-offs, and without testing, it’s impossible to tell whether multiple small downsides outweigh a single large upside or vice versa.

For example: who among us has carried out keyword research into the different ways people search for key content across a site section, determined that there is a form of words with a better combination of volume vs competitiveness, and made a recommendation to update keyword targeting across that section?

Everyone. Every single SEO has done this. And there’s a good chance you’ve made things worse at least some of the time.

You see, we know that we are modelling the real world when we do this kind of research, and we know we have leaky abstractions in there. When we know that 20-25% of all the queries that Google sees are brand new and never-before-seen, we know that keyword research is never going to capture the whole picture. When we know that the long tail of rarely-searched-for variants adds up to more than the highly-competitive head keywords, we know that no data source is going to represent the whole truth.

So even if we execute the change perfectly we know that we are trading off performance across a certain set of keywords for better performance on a different set – but we don’t know which tail is longer, nor can we model competitiveness perfectly, and nor can we capture all the ways people might search tomorrow.

Without testing, we put it out there and hope. We imagine that we will see if it was a bad idea – because we’ll see the drop and roll it back. While that may be true if we manage a -27% variant (yes, we’ve seen this in the wild with a seemingly-sensible change), there is a lot going on with large sites and even a large drop in performance in a sub-section can be missed until months after the fact, at which point it’s hard to reverse engineer what the change was. The drop has already cost real money, the downside might be obscured by seasonality, and just figuring it all out can take large amounts of valuable analysis time. When the drop is 5%, are you still sure you’re going to catch it?

And what if the change isn’t perfect?

The more black-box-like the Google algorithm becomes, the more we have no choice but to see how our ideas perform in the real world when tested against the actual competition. It’s quite possible that our “updated keyword targeting” version loses existing rankings but fails to gain the desired new ones.

Not only that, but rankings are only a part of the question (see: why you can’t judge SEO tests using only ranking data). A large part of PPC management involves testing advert variations to find versions with better clickthrough rates (CTR). What makes you think you can just rattle off a set of updated meta information that correctly weights ranking against CTR?

Our testing bets that you can’t. My colleague Dominic Woodman discussed our ODN successes and failures at Inbound 2018, and highlighted just how easy it is to dodge a bullet if you’re testing SEO changes.

What I learned From Split Testing – Inbound 2018 Snippet from Distilled
We’re talking about small drops here though, right?

Well firstly, no. We have seen updated meta information that looked sensible and was based on real-world keyword data result in a -30% organic traffic drop.

But anyway, small drops can be even more dangerous. As I argued above, big drops are quite likely to be spotted and rolled back. But what about the little ones? If you miss those, are they really that damaging?

Our experience is that a lot of technical and on-page SEO work is all about marginal gains. Of course on large sites with major issues, you can see positive step-changes, but the reality of much of the work is that we are stringing together many small improvements to get significant year-over-year growth via the wonders of compounding.

And in just the same way that friction in financial compounding craters the expected gains (from this article on the effect of fees on investment returns):

If you’re rolling out a combination of small wins and small losses and not testing to understand which are which to roll back the losers, you are going to take a big hit on the compounded benefit, and may even find your traffic flatlining or even declining year over year.
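The arithmetic behind that hit is easy to check. Here is a toy sketch (the percentages are invented for illustration, not real test results): ten small wins of +2% compound to roughly +22%, but if half the changes are quiet -2% losers, the compounded result goes slightly negative.

```python
def compounded(changes):
    """Compound a list of fractional uplifts/drops into one overall change."""
    total = 1.0
    for change in changes:
        total *= 1.0 + change
    return total - 1.0

all_wins = compounded([0.02] * 10)            # ten +2% wins: roughly +22% overall
mixed = compounded([0.02] * 5 + [-0.02] * 5)  # five wins, five unnoticed losers
```

With the mixed sequence, the losers don’t just cancel the winners; compounding leaves you marginally worse off than where you started, which is why rolling back the losers matters so much.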

You can’t eyeball this stuff – we are finding that it’s hard enough to tell apart small uplifts and small drops in the mix of noisy, seasonal data surrounded by competitors who are also changing things measured against a moving target of Google algorithm changes. So you need to be testing.

No but it won’t happen to me

Well firstly, I think it will. In classroom experiments, we have found that even experienced SEOs can be no better than a coin flip at telling which of two variants will rank better for a specific keyword. Add in the unknown query space and the hard-to-predict human factor of CTR, and I’m going to bet you are getting this wrong.

Still don’t believe me? Here are some sensible-sounding changes we have rolled out and discovered resulted in significant organic traffic drops:

Updating on-page targeting to focus on higher-searched-for variants (the example above)
Using higher-CTR copy from AdWords in meta information for organic results
Removing boilerplate copy from large numbers of pages
Adding boilerplate copy to large numbers of pages

Want to start finding your own marginal gains? Click the button below to find out more about ODN and how we are helping clients find their own winners and losers.

CONTACT US TO FIND OUT MORE ABOUT ODN

What is an XML sitemap and why should you have one?

Posted on Oct 5, 2018 in SEO Articles


A good XML sitemap acts as a roadmap of your website which leads Google to all your important pages. XML sitemaps can be good for SEO, as they allow Google to quickly find your essential website pages, even if your internal linking isn’t perfect. This post explains what XML sitemaps are and how they help you rank better.

What are XML sitemaps?

You want Google to crawl every important page of your website, but sometimes pages end up without any internal links pointing to them, making them hard to find. An XML sitemap lists a website’s important pages, making sure Google can find and crawl them all, and helping it understand your website structure:

Yoast.com’s XML sitemap

Above is Yoast.com’s XML sitemap, created by the Yoast SEO plugin; later on, we’ll explain how our plugin helps you create the best XML sitemaps. If you’re not using our plugin, your XML sitemap may look a little different, but it will work the same way.

As you can see, the Yoast.com XML sitemap shows several ‘index’ XML sitemaps: …/post-sitemap.xml, …/page-sitemap.xml, …/video-sitemap.xml etc. This categorization makes a site’s structure as clear as possible, so if you click on one of the index XML sitemaps, you’ll see all URLs in that particular sitemap. For example, if you click on ‘…/post-sitemap.xml’ you’ll see all Yoast.com’s post URLs (click on the image to enlarge):

Yoast.com’s post XML sitemap

You’ll notice a date at the end of each line. This tells Google when each post was last updated and helps with SEO because you want Google to crawl your updated content as soon as possible. When a date changes in the XML sitemap, Google knows there is new content to crawl and index.
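Structurally, each entry is just a `<url>` element containing a `<loc>` and an optional `<lastmod>` date, per the sitemap protocol. A minimal generator (the URL is made up for illustration):

```python
from datetime import date

def sitemap_xml(entries):
    """Render (url, last-modified date) pairs as a minimal XML sitemap."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for loc, lastmod in entries:
        # <lastmod> is what tells Google there is fresh content to recrawl
        lines.append(f"  <url><loc>{loc}</loc>"
                     f"<lastmod>{lastmod.isoformat()}</lastmod></url>")
    lines.append("</urlset>")
    return "\n".join(lines)

xml = sitemap_xml([("https://example.com/post-1/", date(2018, 10, 5))])
```

Plugins like Yoast SEO produce this markup for you; the sketch just shows what the date column in the screenshot corresponds to in the raw XML.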

If you have a very large website, sometimes it’s necessary to split an index XML sitemap. A single XML sitemap is limited to 50,000 URLs, so if your website has more than 50,000 posts, for example, you’ll need two separate XML sitemaps for the post URLs, effectively adding a second index XML sitemap. The Yoast SEO plugin sets the limit even lower, at 1,000 URLs, to keep your XML sitemap loading as fast as possible.
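Splitting is just slicing the URL list into fixed-size chunks and listing each chunk file in a sitemap index. A sketch of that idea (the file-name pattern is an assumption; Yoast’s actual naming may differ):

```python
def split_into_sitemaps(urls, limit=50_000, prefix="post-sitemap"):
    """Slice a URL list into chunks that respect the per-file limit,
    returning (filename, chunk) pairs for a sitemap index to list."""
    chunks = [urls[i:i + limit] for i in range(0, len(urls), limit)]
    return [(f"{prefix}{n + 1}.xml", chunk) for n, chunk in enumerate(chunks)]
```

With 120,000 post URLs and the protocol’s 50,000 limit you’d get three files; Yoast’s lower 1,000-URL limit simply changes the `limit` argument, trading more files for faster loading of each one.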

What websites need an XML sitemap?

Google’s documentation says XML sitemaps are beneficial for “really large websites”, for “websites with large archives”, for “new websites with just a few external links to it” and for “websites which use rich media content”.

Here at Yoast, while we agree that these kinds of websites will definitely benefit the most from having one, we think XML sitemaps are beneficial for every website. Every single website needs Google to be able to easily find its most important pages and to know when they were last updated, which is why this feature is included in the Yoast SEO plugin.

Which pages should be in your XML sitemap?

How do you decide which pages to include in your XML sitemap? Always start by thinking of the relevance of a URL: when a visitor lands on that URL, is it a good result? Do you want visitors to land there? If not, it probably shouldn’t be in your XML sitemap. However, if you really don’t want the URL to show up in the search results, you’ll need to add a ‘noindex, follow’ tag, because leaving it out of your XML sitemap doesn’t mean Google won’t index it. If Google can find the URL by following links, Google can index it.
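That decision rule can be written down directly. A sketch assuming each page carries simple flags (the page structure here is invented for illustration):

```python
def belongs_in_sitemap(page):
    """Include a URL only if it's a good landing result and not noindexed.
    Omission alone doesn't stop indexing: if Google can reach a URL by
    following links it can still index it, hence the separate noindex flag."""
    return page["good_landing_page"] and not page["noindex"]

pages = [
    {"url": "/blog/new-post/", "good_landing_page": True, "noindex": False},
    {"url": "/tag/misc/", "good_landing_page": False, "noindex": True},
]
sitemap_urls = [p["url"] for p in pages if belongs_in_sitemap(p)]
```

The thin tag page is both excluded from the sitemap and noindexed, matching the advice in the examples that follow.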

Example 1: A new blog

Say, for example, you’re starting a new blog. You’ll want Google to find new posts quickly, so your target audience can find your blog on Google, which makes it a good idea to create an XML sitemap right from the start. You might create a handful of first posts, plus categories and some tags to start with. But there won’t be enough content yet to fill the tag overview pages, making them “thin content” that’s not valuable to visitors yet. In this case, leave the tag URLs out of the XML sitemap for now, and set the tag pages to ‘noindex, follow’ because you don’t want people to find them in search results.

Example 2: Media and images

The ‘media’ or ‘image’ XML sitemap is also unnecessary for most websites. This is because your images are probably used within your pages and posts, so will already be included in your ‘post’ or ‘page’ sitemap. So having a separate ‘media’ or ‘image’ XML sitemap would be pointless and we recommend leaving it out of your XML sitemap. The only exception to this is if images are your main business. Photographers, for example, will probably want to show a separate ‘media’ or ‘image’ XML sitemap to Google.

How to make Google find your XML sitemap

If you want Google to find your XML sitemap quicker, you’ll need to add it to your Google Search Console account. In the new Search Console, you can find sitemaps under the ‘Index’ tab. You’ll immediately see whether your XML sitemap has already been added; if not, you can add it at the top of the page:

Yoast.com’s XML sitemap added to the new Google Search Console

Within the old Google Search Console, you can see your sitemaps by navigating to ‘Crawl’ and then clicking ‘Sitemaps’. If you haven’t added your XML sitemap yet, click the ‘Add/Test sitemap’ button, which you can see to the right of the arrow in the image below.

Yoast.com’s XML sitemap added to the old Google Search Console

As you can see in the image, adding your XML sitemap lets you check whether all pages in your sitemap have really been indexed by Google. If there’s a big difference between the ‘submitted’ and ‘indexed’ numbers for a particular sitemap, we recommend looking into it further: an error could be preventing some pages from being indexed, or you may need more content or links pointing to the content that hasn’t been indexed yet.

Yoast SEO and XML sitemaps

Because they are so important for your SEO, we’ve added the ability to create your own XML sitemaps in our Yoast SEO plugin. XML sitemaps are available in both the free and premium versions of the plugin.

Yoast SEO creates an XML sitemap for your website automatically. Click on ‘SEO’ in the sidebar of your WordPress install and then select the ‘Features’ tab:

In this screen, you can enable or disable the different XML sitemaps for your website. Also, you can click on the question mark to expand the information and see more possibilities, like checking your XML sitemap in your browser:

You can exclude content types from your XML sitemap in the ‘Search Appearance’ tab. If you select ‘no’ as an answer to ‘show X in the search results?’ then this type of content won’t be included in the XML sitemap.

Read more about excluding content types here.

Check your own XML sitemap!

Now that you’ve read the whole post, you know how important it is to have an XML sitemap, because having one can really help your site’s SEO. Google can easily access your most important pages and posts if you add the right URLs to your XML sitemap, and it can easily find updated content, so it knows when a URL needs to be crawled again. Lastly, adding your XML sitemap to Google Search Console helps Google find your sitemap fast and lets you check for sitemap errors.

Now go check your own XML sitemap and make sure you’re doing it right!

Read more: WordPress SEO tutorial: definite guide to higher ranking »

The post What is an XML sitemap and why should you have one? appeared first on Yoast.

SEO Title Tags (Everything You Need to Know)

Posted on Sep 28, 2018 in SEO Articles


Optimizing your title tags for SEO is simple:

Just throw your keyword in the title and you’re good to go, right?

Yes and no.

You could stop there and probably do pretty well (if you’ve done everything else right).

But the truth is:

There’s so much more you can do to optimize your title tags.

That’s what this guide is all about.

Make sure you read until the end because I’ll be sharing some title tag optimization tactics that will skyrocket your organic search CTR.

Let’s jump in.

What is a Title Tag?

As the name suggests, an HTML title tag is an element of your web page’s HTML code that indicates its title. It lets both search engines and people know what the page’s content is all about.

You can only have one title tag per page. It will appear in your code as:

<head>
<title>Example of a Title Tag</title>
</head>
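To check what’s actually in the tag on a rendered page, you can parse it out of the HTML. A small sketch using Python’s standard-library parser (the sample markup is the snippet above):

```python
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Accumulate the text inside the document's single <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

parser = TitleExtractor()
parser.feed("<head><title>Example of a Title Tag</title></head>")
```

After feeding the page source, `parser.title` holds the title text that browsers, social previews, and search engines will display.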

Most people will encounter your title tag in four places:

1. Web Browser Tabs

The title tag can be seen on your web browser when you open your page in a new tab.

This is especially helpful when a user has many tabs open and would like to go back to your content. Because of this, it’s important that your title tags are unique, easily recognizable, and immediately distinguishable from other open tabs.

2. Browser Bookmarks

Browser bookmarks in Chrome show the website’s title by default. As you’ll notice below, titles are usually truncated on the “Bookmarks Bar”.

However, you can see more of a page’s title if you’re using folders. This is a good reason to use short but descriptive titles. More on this soon.
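Truncation itself is simple: the display surface cuts the title at some width and appends an ellipsis. A sketch (the 60-character limit is a common rule of thumb, not a fixed spec; Google actually truncates SERP titles by pixel width):

```python
def truncate_title(title: str, limit: int = 60) -> str:
    """Trim a title to roughly what a bookmark bar or SERP will display."""
    if len(title) <= limit:
        return title
    # Reserve one character for the ellipsis itself
    return title[: limit - 1].rstrip() + "…"
```

Writing titles that fit within the limit in the first place avoids having your key words cut off at all.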

3. Shared Media on Social Media Platforms

You know those little previews on Facebook and Twitter when someone shares content on those platforms? Your title tag will show up there as well, letting people know what the page is about and what they can expect to find when they click on that link.

Some social networks will allow you to customize your title tag just for their platform. An enticing title tag helps draw in more visitors.

If you’re on WordPress, you can customize your OG data using Yoast or the All in One SEO Pack. You can also download this OG plugin; it doesn’t require any setup and ensures your “Featured Image” shows up when people share your content on social.

If you’re having issues with your Featured Image not showing, use the following:

Facebook’s Debugger tool (you can force Facebook to recrawl your page).
LinkedIn’s Post Inspector
Twitter’s Card Validator

4. In the SERPs (Search Engine Results Pages)

One of the most important places where your title will show is in Search Engine Results Pages (that includes Google, Bing, Yahoo, DuckDuckGo, etc).

The title tag shows up as a big, blue clickable link above a short meta description or summary.

This means that if someone found your web page by searching a term that is related to your business, this is your first chance to make a lasting impression and convince them to click on your website.

It’s very easy to add a title tag to your website, but writing an effective one takes time, research, and a little skill (that’s easily developed).

But first:

Why are Title Tags Important for SEO?

Some blogs will tell you that title tags are obsolete in 2018. This is misleading. While title tags may not play the same role in SEO as they did a decade ago, there are still many reasons not to neglect this low-effort, high-impact SEO action.

Here are the benefits of optimizing your title tags (the right way):

1. Keyword Rankings

Do you need to place your target keyword in the title tag to rank well in Google?

The short answer is “Yes”.

The longer answer is that it may not be as important as it once was.

Brian’s research found that having the keyword in the title tag does impact rankings, but it’s a small factor in comparison to other factors:

Image Source: Backlinko.com

Ahrefs also found that “there’s a slight correlation between the usage of keywords in the title tag and rankings.”

Image Source: Ahrefs.com

And finally, one last case study from Matthew Barby also indicated that “The presence of keywords in the page title” does correlate to higher rankings.

Image Source: MatthewBarby.com

Truth be told:

I’ve never attempted to rank pages without using the target keyword phrase in the title tag.

That’s because it wouldn’t make sense for me to stop doing what’s working.

My recommendation will continue to be that you should place your target keyword in the title tag. Just keep in mind that it’s a small factor in the larger ranking equation.

2. SERP Click Through Rate (CTR)

Although there’s some debate about CTR being a ranking factor, there’s no denying that increasing your CTR will increase your organic search traffic.

And just to be clear:

The goal of SEO is to get more organic search traffic. When you change your mindset from “rankings” to “traffic” it changes the way you operate.

Optimizing your title tag for maximum CTR is an intelligent action to take.

I’ll explain some tactics you can use to achieve that goal in a second.

Side note: I lean towards CTR being a direct or at least an indirect ranking factor. The way I look at it: there’s no downside to optimizing for CTR, even if it isn’t a ranking factor.

Ross Hudgens from Siege Media has an excellent video on this topic, worth a watch:


TL;DW: CTR may not be a direct ranking factor, but it likely impacts rankings indirectly.

3. Social Sharing

Your page’s title is a focal point when it’s shared on social media. Does that mean you need to use clickbait titles like this?:

No, but you should think about why clickbait works.

The truth is clickbait is only annoying when the actual content doesn’t add real value.

4. Headlines Matter

What you place in your title tag is nothing more than a headline. You’ve probably heard the idea that 8 out of 10 Internet users will read a headline, but only 2 out of 10 will read past it.

Or that:

“Five times as many people read the headline as read the body copy. When you have written your headline, you have spent eighty cents out of your dollar.” – Confessions of an Advertising Man (1963) by David Ogilvy

The truth is:

If you’re reading this, then you’re in the minority.

In fact:

Most people only make it through around 17-20% of my content before returning to watching cat videos.

But regardless, the copy you use within your title tag is the first touch point for readers.

You have to do it well or your engagement will be low.

Those are four important reasons why you need to optimize your title tag, but now I need to cover a few important questions:

Does Google Rewrite Titles?

If Google doesn’t think that your title is relevant, readable, or provides value to your site’s visitors, it can and will completely rewrite it – and often in ways that you won’t like.

In fact, here’s what Gary Illyes said:

“We will never quit rewriting titles. We’ve seen so many sites whose title really suck. A lot of sites have no title; a lot of sites have a title saying “Top Page”. In fact, Google almost always rewrites titles. We couldn’t provide useful results to our users if we quit rewriting titles. Experiments showed us users preferred written titles. So, we’ll continue to write titles.” – Gary Illyes (Source)

It’s pretty clear based on Gary’s words that Google’s algorithms will rewrite your titles (and isn’t planning on stopping anytime soon).

But what can you do to prevent it?

The #1 thing you can do is make sure that your title matches your page’s content/intent. If your title is “Buy Shoes”, but your page is all about “buying blue Nikes”, then Google will likely rewrite your title.

Your title should be a 100% match of the page content.

One other factor you need to consider is title tag length.

How Long Should Your Title be?

There are technically no character limits to your title tag, but search engines can only display so much of your title before cutting it off.

If your title is too long, Google will cut it off with an ellipsis (…), which could potentially prevent site visitors from seeing important information about the page.

According to Moz’s research, Google usually displays the first 50-60 characters (including spaces) of a title tag, but the more accurate limit would be 600px. This is because some characters (like M, W, etc.) take up more space than others.

Staying under 60 characters is a good rule of thumb, but you can also use many title tag preview tools like this one just to be sure.
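As a rough sketch of what those preview tools check (characters only; Google’s real cutoff is about 600 pixels, so wide letters can truncate sooner), the title string below is a hypothetical example:

```python
def check_title_length(title, max_chars=60):
    """Flag titles likely to be truncated in the SERPs.

    A character count is only a rough proxy for Google's ~600px limit,
    since wide letters (M, W) take up more space than narrow ones.
    """
    if len(title) <= max_chars:
        return title, False
    # Preview roughly how Google truncates: cut at the limit and append an ellipsis.
    return title[:max_chars].rstrip() + " ...", True

title = "22 Easy Ways to Optimize Your Title Tags for Better SEO Rankings in 2018"
preview, truncated = check_title_length(title)
print(truncated)  # True: 72 characters is over the 60-character rule of thumb
```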

If you’re on WordPress, Yoast and All-in-One SEO pack will do the job.

If you want to find title tags that are too long at scale, then I recommend using Screaming Frog.

Open up Screaming Frog, enter your target domain, click on the “Page Titles” tab, and select “Over 65 Characters” from the filter:

You can click on each individual URL and preview what the title tag looks like in the SERPs. Just click “SERP Snippet” at the bottom:

Can Your Title Tag and H1 be the Same?

The short answer is, yes. You should try to keep your H1 tag consistent with your title tag, but that doesn’t mean it has to be an exact match. For example, this page has a different title tag and H1 tag:

One method you can use is to vary your H1 from your title tag to rank for more long-tail keyword variations. I prefer keeping my H1 nearly identical to the title, but it’s an element worth testing.

You can use Screaming Frog to find all titles that are the same as your H1 tags.

Open up Screaming Frog, enter your target domain, click on the “Page Titles” tab, and select “Same as H1” from the filter:

With some of those important title tag questions out of the way, let me show you:

22 Easy Ways to Optimize Your Title Tags for SEO

Since we’ve already established that a good title tag is a low-effort way to optimize both your SERP ranking and your CTR, how exactly do you go about writing one?

Here are 22 ways to optimize your title tags for better rankings, CTR, and social sharing:

1. Focus on the Content First

That’s right. The first action you need to take is to make sure your SEO content is the highest quality possible. It doesn’t matter how well you optimize your title tag if the page itself is low-value.

Getting the click is important, but getting visitors to dwell longer, visit more than one page, or complete a goal is what the objective should be. That’s only possible if you’re crafting effective SEO content.

Don’t take this step lightly!

2. Identify the Page Type

How you craft your titles will depend on the page type. For example, optimizing a title tag for a product page will be much different than a blog post.

There are a few different types of SEO-driven pages that a website will have:

Homepages

If you decide to optimize your homepage for a target keyword, there’s a good chance it will have middle or bottom of the funnel intent. For example, Hubspot targets “inbound marketing software” with their homepage.

This keyword phrase has transactional intent so their homepage is structured to drive leads for their software (not educate).

Notice the effective use of a curiosity gap at the end of their title tag as well.

Category Pages

E-commerce websites are the most likely candidate to try to rank category pages. However, there are some information-driven websites where it makes sense.

For example, RTINGS have a beautifully-structured category page for the target keyword phrase “tv reviews”.

Although the keyword phrase “tv reviews” may lead to a sale in the future, I still consider it to be top of the funnel intent. Or, informational in nature.

Notice that RTINGS front-loads their primary keyword phrase and uses not one, but two modifiers (“Best” and “2018”).

Product Pages

Many product pages will target a combination of Navigational/Transactional keyword phrases. For example, take a look at the keyword phrase “Nike trout 4 cleats”.

Someone searching this keyword is primed to buy, so the title tag needs to reflect that intent.

Local Pages

Keyword stuffing title tags seems to be a common practice on the local level. After digging around, I was able to find an interesting example for the keyword phrase “Los Angeles personal injury lawyer”.

Although I don’t love the idea of jamming “car accident lawyers” in the title, I do like a few things about this title. First, they’ve front-loaded their primary keyword. Second, they’re using numbers within their title, which makes it much more eye-grabbing.

Blog Posts

Crafting title tags for blog posts is the easiest to understand.

Your goal should be to make your title as accurate and interesting as possible. The following tips can drastically improve your blog post title performance.

Most blog posts are going to target keyword phrases with informational intent, so you need to satisfy that.

3. Satisfy Searcher Intent

This applies to both your title and the page itself. The best way to satisfy searcher intent is to think about it from a funnel or buyer journey perspective.

There are four primary categories of searcher intent:

Informational – These are top of the funnel search queries such as “what is SEO”.
Comparison – These are middle of the funnel search queries such as “Ahrefs vs Moz”.
Transactional – These are bottom of the funnel search queries such as “Moz free trial”.
Navigational – These types of search queries are branded like “Gotch SEO”. This means the searcher already knows your brand or may already be a customer.

Most keyword phrases will fall under one or more of these categories.

Your title must satisfy the search intent behind the keyword phrase you’re targeting. You do not want ambiguity. Make it as clear as possible for the searcher.

4. Front-Load Your Primary Keyword

If you approach crafting your title tags from a searcher intent perspective, it would make sense to have the keyword phrase front-and-center. If someone’s searching for “best baseball cleats”, they’re likely to click on a result that showcases that keyword right away.

Keep in mind that “front-loading” doesn’t mean that your keyword phrase needs to be first in the title tag. It just needs to be towards the beginning.

5. Write for Searchers, Not Search Engines

Yes, place your keyword in your title, but don’t do this:

“SEO Company | SEO Agency | Chicago SEO Company”

You wouldn’t believe how often we find this type of keyword stuffing in our SEO audits (check out our SEO audit service if you need help).

There are a few reasons why you shouldn’t stuff keywords in your title tag:

It’s Not Necessary

Google’s algorithms are much more sophisticated than before. More specifically, Google’s Hummingbird algorithm is designed to understand content better.

That means it can identify synonyms and variations of your keywords. You don’t need to jam keyword variations into your title tag. Instead, you can place keyword variations or synonyms naturally throughout your copy and you’ll still perform well for them (given you did everything else right).

You Should Only Target One Primary Keyword Phrase Per Page

Although there are some exceptions to the rule (super authoritative websites), you should aim to target one primary keyword per page.

You’re Losing Precious Real Estate

Most keyword phrases aren’t persuasive in any way. When you stuff your title tag full of keywords, you’re losing the ability to add elements of effective copywriting and persuasion. I’ll be explaining some of these tactics in a second.

6. Use Shorter Titles

Matthew Barby’s research found that shorter titles tend to perform better in Google:

Image Source: MatthewBarby.com

Try to stay below 60 characters (including spaces).

If you’re struggling to keep it below 60 characters, then try the following:

Avoid using all-caps in your title tag. Capital letters take up more space than lowercase letters.
Avoid unnecessary punctuation
Remove redundant or repetitive words
Use short phrases instead of long, complicated ones

7. Avoid Duplicating Page Titles

No two pages (that you want indexed in Google) should have the same title. The best way to find duplicate page titles is to use Screaming Frog SEO Spider.

Open up Screaming Frog SEO Spider, enter the target domain, and click on the “Page Titles” tab:

Then click the “Filter” dropdown and select “Duplicate”:

Sort the list by “Title 1”:

You only need to be concerned about duplicate title tags if your page is indexed. The new version of Screaming Frog makes this super easy with their new “Indexability” column.
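If you’d rather script this check than eyeball a crawl export, here’s a small Python sketch; the crawl data below is a made-up stand-in for a Screaming Frog export, including its “Indexability” column:

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group indexable URLs by title and return any title used more than once.

    `pages` is a list of (url, title, indexable) tuples, standing in for
    a Screaming Frog crawl export.
    """
    by_title = defaultdict(list)
    for url, title, indexable in pages:
        if indexable:  # only indexed pages matter for duplicate titles
            by_title[title].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical crawl data for illustration.
crawl = [
    ("/shoes", "Buy Shoes Online", True),
    ("/shoes?page=2", "Buy Shoes Online", True),
    ("/cart", "Buy Shoes Online", False),  # noindex: ignored
    ("/about", "About Us", True),
]
print(find_duplicate_titles(crawl))
# -> {'Buy Shoes Online': ['/shoes', '/shoes?page=2']}
```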

8. Write Unique Titles for EVERY Page

Every page on your website should have a unique title. In fact, according to Google:

“Titles are critical to giving users a quick insight into the content of a result and why it’s relevant to their query. It’s often the primary piece of information used to decide which result to click on, so it’s important to use high-quality titles on your web pages.” – Google

The best way to find pages with missing titles is to once again, use Screaming Frog SEO Spider.

The steps are identical to the ones above, except you’ll select “Missing”:

9. Use Title Modifiers

If you dig through my content on Gotch SEO, you’ll discover that I love using title modifiers. I believe using title modifiers is one of the best ways to drive more long-tail traffic (without much effort).

I actually call this The Phantom Technique because many of these keyword variations are largely untapped.

Here’s a free video from my paid training course, Gotch SEO Academy explaining how to execute this tactic:

Like this training? Join Gotch SEO Academy today and save 20% when you use coupon code “titletag” at checkout.

With that said:

Some simple title modifiers you can use are “top”, “best”, or the year.

Important note: If it’s relevant to use a year in your title tag, make sure that your URL doesn’t include it. For example, I update my anchor text guide every year and change the year in the title tag, but the URL never changes.

That means I can continue to build the authority of that page because my URL isn’t changing every year.

10. Build a Keyword Variation List

I also build a keyword variation list every time I find a new primary keyword phrase to target. For example, my primary keyword phrase for my backlinks guide is “backlinks”.

But obviously my title couldn’t just be “Backlinks | Gotch SEO” because A) that’s boring and B) I would lose out on long-tail traffic.

Instead, I searched for relevant keyword variations I could naturally add to the title.

Ahrefs Keyword Explorer is perfect for this task.

Enter your primary keyword phrase, start the analysis, and then click on “Phrase Match”:

This section is a goldmine for finding keyword variations for your title.

You can also use UberSuggest and Keywords Everywhere to build your keyword variation list (both are free).

Although you won’t use 99.9% of these variations in your title tag, a large percentage of these keywords can be dispersed throughout your page.

11. Emphasize Freshness

Do you know anyone that prefers old content? I don’t and that’s why emphasizing “freshness” in your title works really well.

One persuasion principle that I picked up from Frank Kern is that people love “new” things. In fact, something simply being “new” can be a big driving force.

Hence the reason why you’re more likely to buy a newer model car than a car from the 80s.

Another example is when you see a training course use “2.0” or “Revamped” in its headline. They’re emphasizing freshness.

Some ways to incorporate freshness into your title tags are to use the word “new”, “updated for YEAR”, “new data”, etc.

12. Use the H & W Strategy

The H & W strategy is simple: just use one of the following words in your title tag: “How,” “What,” “Why,” “When,” “Where,” or “Who.”

How to {Create|Learn|Build|Use|Leverage|Increase|Get|Do}…

Example: How to Tie a Windsor Knot

Total Organic Keywords: 5,079
Total Linking Root Domains: 161
Total Social Shares: 819 (Buzzsumo)

What {are|is}?

Example: What Are Second Cousins vs. Cousins Once Removed

Total Organic Keywords: 2,600
Total Linking Root Domains: 59
Total Social Shares: 1.9 Million (Buzzsumo)

Why

Example: Why the Myers-Briggs Test is Meaningless

Total Organic Keywords: 2,500
Total Linking Root Domains: 77
Total Social Shares: 19,000 (Buzzsumo)

When

Example: 21 High-Protein Snacks To Eat When You’re Trying To Be Healthy

Total Organic Keywords: 1,800
Total Linking Root Domains: 32
Total Social Shares: 28,000 (Ahrefs)

Where

Example: The Complete Guide to Where to Put Your Eye Makeup

Total Organic Keywords: 5,200
Total Linking Root Domains: 33
Total Social Shares: 26,000 (Ahrefs)

13. Use Numbers

We’ve all been victim of consuming numbered listicles at one point or another. That’s because they’re super effective.

A study by Conductor found that 36% of respondents preferred headlines that included numbers:

Image Source: Moz.com

An example of an effective listicle post is “18 Unforgettable Countries Where You Can Roll Big on $50 a Day”. This example ranks for “cheapest countries to visit” (~3,600 searches/mo), has 45 linking root domains, and over 81,000 social shares.

Outside of the traditional listicle, you can also use monetary values such as: “Silicon Valley’s $400 Juicer May Be Feeling the Squeeze”.

Or, you can use percentages in title tags like this: “Nike’s online sales jumped 31% after company unveiled Kaepernick campaign”.

14. Use This Secret Title Tag Hack (Copywriters Hate It)

Ahh… yes, the classic clickbait headline.

I know I’ve fallen for many, but that’s because they work well! Mainly because they leave open loops in your mind and engage our natural human curiosity.

The trick here is to give readers a sneak peek into what they can find out by clicking on your link without giving too much away.

Employ as much tantalizing language as necessary; remember: you need to evoke surprise, amazement, or speak to a deeply-rooted fear. You can combine this technique with the other techniques above to create a truly click-worthy headline.

Example: 7 Unbelievable Exercises That Will Help Keep Your Nose In Shape

Total Organic Keywords: 3,500
Total Linking Root Domains: 17
Total Social Shares: 12,000 (Ahrefs)

Note: Use clickbait tactics sparingly because they can come across as annoying or inauthentic. Overuse could hurt your brand’s perceived value.

15. Be the Most Comprehensive

Fear of Missing Out (FOMO) applies in many different scenarios, but especially with knowledge gaps. People want assurance that they aren’t missing out on any important information.

That’s why {Complete|Ultimate|Definitive} guides work well.

Example: The Ultimate Guide To Brunching In NYC

Total Organic Keywords: 3,300
Total Linking Root Domains: 62
Total Social Shares: 48,000 (Ahrefs)

16. Emphasize Speed (or Time Savings)

One of the most powerful benefits to emphasize is saving time. Although this usually applies to products, it can be emphasized in title tags as well.

Use words like “fast”, “quick”, “simple”, etc.

Example: How to Get Rid of Stretch Marks Fast

Total Organic Keywords: 4,200
Total Linking Root Domains: 113
Total Social Shares: 160,000 (Ahrefs)

17. Break the Pattern

Pattern interrupts are common in video content, but there are ways to break the pattern in the SERPs as well. Some of the best methods are to use [brackets], {curly brackets}, (parentheses), equals signs (=), plus (+) or minus (-) signs, or pretty much any out-of-the-ordinary symbol.

You can also test using Emojis in title tags as well. Google doesn’t always show them though.

18. Use Title Tags to Find Keyword Cannibalization

Keyword cannibalization occurs when two or more pages on your website are optimized for the same keyword phrase. Auditing your title tags using Screaming Frog SEO Spider is actually one of the fastest ways to identify keyword cannibalization.

Open up SFSS, enter your target domain, click on the “Page Titles” tab, and keep the filter set to “All”:

You can then use SFSS’s built-in search function to find pages that are similar. In this example below, I searched “backlinks” and identified two pages using that primary keyword phrase.

In this case, it doesn’t make sense to consolidate these assets because the intent behind “how to build backlinks” vs. “buy backlinks” is much different.

Identifying keyword cannibalization issues requires manual analysis, but it’s time well spent.
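The same title search can be sketched in a few lines of Python. The URLs and titles below are hypothetical, and a match only flags candidates: as the example above shows, you still have to compare intent by hand before consolidating anything.

```python
def pages_targeting(pages, phrase):
    """Return URLs whose title contains the phrase (case-insensitive),
    mirroring a title search for potential keyword cannibalization."""
    phrase = phrase.lower()
    return [url for url, title in pages if phrase in title.lower()]

# Hypothetical site pages for illustration.
pages = [
    ("/backlinks-guide", "How to Build Backlinks in 2018"),
    ("/buy-backlinks", "Should You Buy Backlinks?"),
    ("/anchor-text", "Anchor Text: A Data-Driven Guide"),
]
matches = pages_targeting(pages, "backlinks")
print(matches)  # -> ['/backlinks-guide', '/buy-backlinks']
```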

19. Test Your Titles

How do you know if your title will be effective? Well, the good news is that it doesn’t have to be a shot in the dark. I recommend using AM Institute’s tool to test and refine your titles before going live:

You can also use CoSchedule’s free headline analyzer tool.

20. Incorporate All the Methods

The good news is that you don’t need to be exclusive with what techniques you use. Mix and match the title tag optimization methods to get the best results possible.

21. Measure Performance with Google Search Console

Google Search Console shows you CTR data for your organic keywords. Just click on the “Performance” tab and you’ll get access to all kinds of useful data:

Although your CTR is determined by more than just your title tag, it’s one of the most important factors. If you are ranking well, but your CTR is subpar, then you should test changing your title.

Here’s a simple title tag testing framework I use:

Create 10-20 title variations
Qualify the idea using AM Institute’s tool
Execute the change
Annotate the change in Google Analytics
Wait (at least 3-4 weeks) – You need to give Google time to recrawl the page and see whether there’s a positive or negative impact.

The goal of these tests is to increase CTR.
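For reference, the CTR that Search Console reports is simply clicks divided by impressions. A quick sketch with hypothetical before/after numbers for a title tag test:

```python
def ctr(clicks, impressions):
    """Organic CTR: clicks divided by impressions (what GSC's Performance tab reports)."""
    return clicks / impressions if impressions else 0.0

# Hypothetical numbers for a 4-week title tag test.
before = ctr(clicks=120, impressions=6000)  # 2.0% CTR with the old title
after = ctr(clicks=180, impressions=6000)   # 3.0% CTR with the new title
print(f"CTR lift: {(after - before) / before:.0%}")  # CTR lift: 50%
```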

Keep in mind: Navigational search queries (that aren’t your brand name) like “Blogspot” (I’ve been floating between the #2 – #5 spot) will have low CTR:

Changing your title tag won’t do much in this scenario because it’s based on intent.

On the other hand:

Navigational search queries that ARE for your brand (branded search) should have exceptionally high CTR:

22. Be Realistic

All of these methods will help you optimize your title tags for peak SEO performance.

But don’t forget:

Placing your keyword in your title tag is a micro ranking factor.

Think of it as the bare minimum for ranking well.

That’s All for Title Tags!

I hope this guide helped you learn a thing (or two) about title tags.

If you got a lot of value out of this post, please share it and drop a comment below. I respond to every single one.

The Secret Behind My 1,866,913 Monthly Search Visitors (It’s Not What You Think)

Posted by on Sep 21, 2018 in SEO Articles | Comments Off on The Secret Behind My 1,866,913 Monthly Search Visitors (It’s Not What You Think)

How many visitors do you think NeilPatel.com generates each month?

Maybe a million… maybe 2 million?

I bet you’re going to guess 1,866,913.

If that’s what you guessed, you are wrong. This blog actually generated 2,530,346 visitors. 1,866,913 is the number that came from search engines.

So, what’s the secret to my ever-growing Google traffic?

Sure, I have optimized my on-page SEO, I’ve built links, written tons of blog posts… I’ve done all of the stuff that most of my competition has done. But doing the same stuff as your competition isn’t enough.

My secret sauce is that I optimize for user signals.

Last week, I broke down some of the user signals Google looks at and provided benchmarks to aim for if you don’t want to be penalized by Google.

If you aren’t familiar with user signals, check the article I linked to above.

So, how do you optimize for user signals?

Well, I know everyone has different types of websites, so I thought I would share the process I use to optimize NeilPatel.com.

Are you showing people what they want?

Google Analytics is an amazing tool. I’m so addicted to it that I log in at least 3 or 4 times a day. Heck, I even log in on weekends.

But here’s the thing, it only tells you half the story. It gives you numbers, but it doesn’t help you visualize what people are doing and what they aren’t.

For example, here is what my main blog page looked like according to Crazy Egg:

What’s wrong with the image?

Everyone is going to the blog to learn more about marketing. Above the fold, I have a box that showcases an SEO Analyzer. But there is one big issue: it’s barely clicked compared to the drop-down that lets you filter the blog content.

The SEO Analyzer had 128 clicks versus 359 clicks to the content filtering option.

Because you didn’t care for it as much, I removed it from the main blog page. And now when you head to the blog page you can see the filtering options above the fold.

I am looking to see what you click on and what you don’t. Simple as that.

If I keep showing you something you aren’t clicking on, I am wasting the opportunity to present you with something you do want to see. Which means I either need to adjust it or delete it.

Now, let me show you my current homepage:

What’s wrong?

Go ahead, take a guess…

Well, looking at the image you’ll notice there are tons of hot spots in the footer. That’s where the navigation is. With there being all of the clicks on the navigation, I should consider adding a navigation menu bar in the header.

Are you getting the hang of how to make your website more user-friendly? Well, let’s try another one.

Here’s an element in the sidebar of my blog posts:

That element only has 1 click. That’s terrible considering that the blog post generated 10,016 visits. And to top it off, that click came from a repeat visitor.

My goal is to convert more first-time visitors into leads; they make up the majority of my visitors but the lowest percentage of my leads.

So, what did I do? I deleted that element and you no longer see it in my sidebar.

Are you optimizing for mobile?

Let’s face it, more people are visiting your site using mobile devices than laptops or traditional computers.

If that’s not the case, it is just a matter of time.

So, have you optimized your site for mobile? And no, I’m not just talking about having a responsive design because everyone is doing that these days.

If you look at the image above, you’ll notice that I removed the image of myself and a few other elements. This helps make the loading experience faster and it helps focus people’s attention on the most important elements.

Similar to the desktop version, my mobile homepage has a 24% conversion rate. When my mobile version included a picture of me above the fold, my conversion rate dropped to 17%… hence there is no picture of me. 😉

Now, I want you to look at the mobile version of my main blog page and compare it to my homepage.

Do you see an issue?

The blog page generates a lot of clicks on the 3 bars at the top… that’s my navigation menu.

My developer accidentally removed that from the mobile homepage. That’s why the contact button in the footer of the homepage gets too many clicks.

Hopefully, that gets fixed in the next day or two as that could be negatively impacting my mobile rankings.

On top of optimizing the mobile experience, you need to ensure your website loads fast. It doesn’t matter if people are using LTE or 4G; sometimes people have terrible reception. And when they do, your website will load slowly.

By optimizing it for speed, you’ll reduce the number of people who just bounce away from your site.

If you want a faster load time, follow this.

And don’t just optimize your site for speed once and forget about it. As you make changes to your site, your pagespeed score will drop, which means you’ll have to continually do it.

For example, you’ll notice I have been making a lot of changes to NeilPatel.com (at least that is what the heatmaps above show). As I make those changes, sometimes they affect my pagespeed score negatively. That means I have to go back and optimize my load time again.

A one-second delay in load time will, on average, cost you 6.8% of your revenue.
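To put that statistic in concrete terms (treating the cited 6.8%-per-second average as a rule of thumb, not a law):

```python
def revenue_lost(monthly_revenue, delay_seconds=1, loss_rate_per_second=0.068):
    """Estimated revenue lost to load-time delay, using the cited 6.8%-per-second average."""
    return monthly_revenue * loss_rate_per_second * delay_seconds

# A hypothetical site earning $100,000/month with a one-second delay:
print(round(revenue_lost(100_000)))  # -> 6800
```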

Are you focusing on helping all of your users?

Not every person who visits your website is the same.

For example, a small percentage of the people who visit NeilPatel.com work at large corporations that are publicly traded and are worth billions of dollars.

And a much larger percentage of my visitors own small and medium-sized businesses. These people are trying to figure out how to grow their traffic and revenue without spending an arm and a leg.

And the largest percentage of my visitors don’t have a website and they are trying to figure out how to get started for free.

In a nutshell, I have three groups of people who visit my website. The first group tends to turn into consulting leads for my agency, but they make up the smallest portion of my traffic.

One could say that I should only focus on helping them and ignore everyone else. But I can’t do that for a few reasons…

I started off with having practically no money and people helped me out when I couldn’t afford to pay them. I love paying it forward and helping people who can’t afford my services because I have been there, and I know what it’s like.
If I only focused on the large companies, who would link to my website and promote my content? You can bet that Microsoft isn’t going to link to me on a regular basis. If you want to generate social shares and backlinks you have to focus on the masses.
Little is the new big… if you can please the masses, they will make noise and the big players will eventually hear about you. So, don’t just treat people with deep pockets kindly, treat everyone the same and truly care about your visitors.

Once you figure out the types of people coming to your website (and if you are unsure just survey them), go above and beyond to help them out. Create different experiences for each group.

On NeilPatel.com, I’ve learned that people who work at large corporations are busy and they want to listen to marketing advice on the run. For that reason, I have the Marketing School podcast.

And a lot of beginners wanted me to break down my steps over video, so they can more easily replicate my tactics. For that reason, I create new videos 3 times per week giving marketing and business advice.

Many of you want to attend the conferences that I speak at, but can’t afford to buy a ticket. For those people, I create weekly webinars that are similar to the speeches I give at conferences.

And best of all, I know the majority of you find it hard to follow along with all of these tips as it can be overwhelming. So, I created Ubersuggest to help you out.

In other words, I try to go above and beyond for all of my visitors.

Yes, it is a lot of work, but if you want to dominate an industry it won’t happen overnight. Expect to put in a lot of time and energy.

Are you taking feedback from people?

You are going to get feedback. Whether it is in the form of email or comments, people will give you feedback.

It’s up to you if you want to listen… but if a lot of people are telling you the same thing you should consider it.

For example, I get a ton of comments on YouTube from people asking me to create videos in Hindi.

And…

Now, I am not only working on adding Hindi subtitles to my videos, but I am also working on translating my blog content to Hindi.

I’m not doing this to make more money… I’m not doing this to become popular… I’m just trying to help out more people.

It’s the same reason why I have Spanish, Portuguese, and German versions of this website. I had enough requests that I pulled the trigger, even though I am not focused on generating income in those areas.

But here is the thing that most people don’t tell you about business. If you just focus on helping people and solving their problems, you’ll notice that your income will go up over time.

Businesses make money not because their goal is to make money… they make money because they are solving a problem and helping people out.

Another piece of feedback I have been getting recently is that my blog is too hard to read on mobile devices.

For that reason, I’ve assigned a task to one of my developers to fix this.

Conclusion

Traffic generation is a business. It’s not a hobby. It’s competitive, and it’s difficult to see short-term gains.

If you want to rank at the top of Google, you can’t treat your website as a hobby. You have to treat it like a business.

And similar to any business, you won’t succeed unless you pay attention to the needs of your customers. That means you have to listen to them. Figure out what they want and provide it.

That’s what Google is trying to do. They are trying to rank sites that people love at the top of their search engine. If you want to be one of those sites, then start paying attention to your visitors.

Show them what they want and go above and beyond so that they will fall in love with your website instead of your competition.

If you aren’t sure whether you are making the right changes, monitor your brand queries. Growth in the number of people searching for your brand terms on Google is a big leading indicator that people are happy with your website.

Just look at NeilPatel.com: I get over 40,000 visitors a month from people Googling variations of my name:

And I generate over 70,000 visits a month just from people searching for my free tool, Ubersuggest.

That’s how I’m continually able to make my traffic grow.

Yes, I do pay attention to what Google loves, but more importantly, I pay attention to your needs and wants.

Are you going to start optimizing your website for user signals?

The post The Secret Behind My 1,866,913 Monthly Search Visitors (It’s Not What You Think) appeared first on Neil Patel.

Local Business Transparency & Empathy for the Holidays: Tips + Downloadable Checklist

Posted by on Sep 14, 2018 in SEO Articles | Comments Off on Local Business Transparency & Empathy for the Holidays: Tips + Downloadable Checklist

Posted by MiriamEllis

Your local business will invest its all in stocking shelves and menus with the right goods and services in advance of the 2018 holiday season, but does your inventory include the on-and-offline experiences consumers say they want most?

Right now, a potential patron near you is having an experience that will inform their decision of whether to do business with you at year’s end, and their takeaway is largely hinging on two things: your brand’s transparency and empathy.

An excellent SproutSocial survey of 1,000 consumers found that people define transparency as being:

Open (59%)
Clear (53%)
Honest (49%)

Meanwhile, after a trying year of fake news, bad news, and privacy breaches, Americans could certainly use some empathy from brands that respect their rights, needs, aspirations, and time.

Today, let’s explore how your local brand can gift customers with both transparency and empathy before and during the holiday season, and let’s make it easy for your team with a shareable, downloadable checklist, complete with 20 tips for in-store excellence and holiday Google My Business best practices:

Grab the Holiday Checklist now!

For consumers, even the little things mean a lot

Your brother eats at that restaurant because its owner fed 10,000 meals to displaced residents during a wildfire. My sister won’t buy merchandise from that shop because their hiring practices are discriminatory. A friend was so amazed when the big brand CEO responded personally to her complaint that she’s telling all her social followers about it now.

Maybe it’s always been a national pastime for Americans to benefit one another with wisdom gained from their purchasing experiences. I own one of the first cookbooks ever published in this country and ‘tis full of wyse warnings about how to avoid “doctored” meats and grains in the marketplace. Social media has certainly amplified our voices, but it has done something else that truly does feel fresh and new. Consider SproutSocial’s findings that:

86% of Americans say transparency from businesses is more important than ever before.
40% of people who say brand transparency is more important than ever before attribute it to social media.
63% of people say CEOs who have their own social profiles are better representatives for their companies than CEOs who do not.

What were customers’ chances of seeking redress and publicity just 20 years ago if a big brand treated them poorly? Today, they can document with video, write a review, tweet to the multitudes, even get picked up by national news. They can use a search engine to dig up the truth about a company’s past and present practices. And… they can find the social profiles of a growing number of brand representatives and speak to them directly about their experiences, putting the ball in the company’s court to respond for all to see.

In other words, people increasingly assume brands should be directly accessible. That’s new!

Should this increased expectation of interactive transparency terrify businesses?

Absolutely not, if their intentions and policies are open, clear, and honest. It’s a little thing to treat a customer with fairness and regard, but its impacts in the age of social media are not small. In fact, SproutSocial found that transparent practices are golden as far as consumer loyalty is concerned:

85% of people say a business’ history of being transparent makes them more likely to give it a second chance after a bad experience.
89% of people say a business can regain their trust if it admits to a mistake and is transparent about the steps it will take to resolve the issue.

I highly recommend reading the entire SproutSocial study, and while it focuses mainly on general brands and general social media, my read of it correlated again and again to the specific scenario of local businesses. Let’s talk about this!

How transparency & empathy relate to local brands

“73.8% of customers were either likely or extremely likely to continue to do business with a merchant once the complaint had been resolved.”
– GetFiveStars

On the local business scene, we’re also witnessing the rising trend of consumers who expect accountability and accessibility, and who speak up when they don’t encounter it. Local businesses need to commit to openness in terms of their business practices, just as digital businesses do, but there are some special nuances at play here, too.

I can’t count the number of negative reviews I’ve read that cited inconvenience caused by local business listings containing wrong addresses and incorrect hours. These reviewers have experienced a sense of ill-usage stemming from a perceived lack of respect for their busy schedules and a lack of brand concern for their well-being. Neglected online local business information leads to neglected-feeling customers who sometimes even believe that a company is hiding the truth from them!

These are avoidable outcomes. As the above quote from a GetFiveStars survey demonstrates, local brands that fully participate in anticipating, hearing, and responding to consumer needs are rewarded with loyalty. Given this, as we begin the countdown to holiday shopping, be sure you’re fostering basic transparency and empathy with simple steps like:

Checking your core citations for accurate names, addresses, phone numbers, and other info and making necessary corrections
Updating your local business listing hours to reflect extended holiday hours and closures
Updating your website and all local landing pages to reflect this information

Next, bolster more advanced transparency by:

Using Google Posts to clearly highlight your major sale dates so people don’t feel tricked or left out
Answering all consumer questions via Google Questions & Answers in your Google Knowledge Panels
Responding swiftly to both positive and negative reviews on core platforms
Monitoring and participating on all social discussion of your brand when concerns or complaints arise, letting customers know you are accessible
Posting in-store signage directing customers to complaint phone/text hotlines

And, finally, create an empathetic rapport with customers via efforts like:

Developing and publishing a consumer-centric service policy both on your website and in signage or print materials in all of your locations
Using Google My Business attributes to let patrons know about features like wheelchair accessibility, available parking, pet-friendliness, etc.
Publishing your company giving strategies so that customers can feel spending with you supports good things — for example, X% of sales going to a local homeless shelter, children’s hospital, or other worthy cause
Creating a true welcome for all patrons, regardless of gender, identity, race, creed, or culture — for example, gender neutral bathrooms, feeding stations for mothers, fragrance-free environments for the chemically sensitive, or even a few comfortable chairs for tired shoppers to rest in

A company commitment to standards like TAGFEE coupled with a basic regard for the rights, well-being, and aspirations of customers year-round can stand a local brand in very good stead at the holidays. Sometimes it’s the intangible goods a brand stocks — like goodwill towards one’s local community — that yield a brand of loyalty nothing else can buy.

Why not organize for it, organize for the mutual benefits of business and society with a detailed, step-by-step checklist you can take to your next team meeting?:

Download the 2018 Holiday Local SEO Checklist

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

What We Learned in August 2018: The Digital Marketing Month in a Minute

Posted by on Sep 6, 2018 in SEO Articles | Comments Off on What We Learned in August 2018: The Digital Marketing Month in a Minute

What We Learned in August 2018: The Digital Marketing Month in a Minute

The average Briton spends over 2 hours online per day

Several fascinating findings were revealed in Ofcom’s annual review of the UK communications sector. Key takeaways include:

On average, people in the UK spend the equivalent of a day online per week

Facebook’s reach among 18-24 year-olds is in decline (by 4% YoY)

The smartphone video advertising market is worth over 1 billion pounds

Read the full story (Linkedin)

Google risks class action due to “surreptitious” tracking of user location

A lawsuit was filed against Google in California after it was found that Google still tracks a smartphone’s location even when the “Location History” setting is turned off. The implications may force Google to pay a substantial fine and to delete some of its location tracking data. A few days after the lawsuit, the Location History support page on Google’s website was changed from “with Location History off, the places you go are no longer stored” to “some location data may be saved as part of your activity on other services, like Search and Maps.”

Read the full story (Marketing Land)

Google improves accuracy of Index Coverage report in Search Console

Google has updated the Index Coverage report within Search Console for the first time since its launch. This feature was originally introduced in 2017 in order to provide information on the pages of your website that have/have not been indexed (with instructions to fix issues). According to Google, the new update will heavily impact the accuracy of the report starting from August. The only drawback of this refresh is that the index coverage data for the period July 14–August 1 was not fully recorded, so it was estimated from the values recorded on the 1st of August.

Read the full story (Search Engine Journal)

Google confirms core algorithm update

On August 1, Google confirmed industry rumours about remarkable ranking fluctuations by announcing the rollout of a core algorithm update. Nicknamed the “medic update” in the SEO industry due to the large number of health and medical sites affected, its reach in fact extended beyond that bracket to the broader category of YMYL (Your Money or Your Life) sites.

Read the full story (Moz)

Study reveals truth behind shopping via voice search

Despite being one of the most hyped subjects of 2018, a recent study from The Information revealed that voice search does not yet seem to be driving sales. Only 2% of the customers who use Amazon’s Alexa intelligent assistant appear to have made a purchase via voice search, and only 10% of those have made a second purchase. This is probably due to an inefficient consumer journey, device limitations or simply that people are not generally aware of the capabilities yet. How soon might this change?

Read the full story (TechCrunch)

Schema markup for datasets now supported in the SERP

Google has confirmed that dataset markups will be supported in the SERP. By doing so, Google is trying to improve the way users visualise data in the search result page, rewarding organisations that mark up datasets such as tables, CSV files, images containing data and more.

As Google stated, this new markup aims “to improve discovery of datasets from fields such as life sciences, social sciences, machine learning, civic and government data, and more”.

Read our blog to learn how to understand and implement structured data.

Read the full story (Search Engine Land)

FB launches mobile-first video creation for better ads

Facebook has rolled out a new set of tools aimed at advertisers that produce assets with mobile-first in mind, since their research has proven that “mobile-first creative has a 27 percent higher likelihood of driving brand lift compared to ads that are not optimized for mobile.”

It is now possible to add motion to existing images/videos or create videos from assets such as pictures and logos.

Read the full story (Marketing Land)

Ad spend on Instagram on the rise

Despite Facebook having experienced the largest one-day fall in American stock market history on the 26th of July, the stock is still trading at May 2018 levels. In their latest earnings report, they disclosed strong growth in ad spend on Instagram in Q2, up 177% year-over-year.

Instagram’s customer base has surpassed a billion active users and, according to the social media listening company socialmediabakers, brands’ Instagram profiles have a much higher user engagement compared to their Facebook equivalents. While there are challenges around the advertising opportunities in the “stories” functionality, Facebook as a whole is continuing to see their playbook work supremely well.

Read the full story (Marketing Land)

Europe to fine sites for not taking down illegal content within one hour

The EU is planning to take a stronger position on illegal or controversial material posted online, especially on social media platforms such as Facebook, Twitter and YouTube. Julian King, EU Commissioner for Security, has put forward legislation which would fine tech companies that do not remove illegal content within one hour. This follows wide reporting of a recent study which found a correlation between social media usage and hate crimes (based on violence against refugees in Germany in 2018), though the study has received some criticism, in particular for its use of global “likes” for Nutella as a proxy for German Facebook usage.

Read the full story (Business Insider)

Twitter’s new tests: threaded replies and online status indicators

Twitter’s director of product management, Sara Haider, has posted a few screenshots that display some new features Twitter is working on to improve threaded conversations and add online status indicators. The reason behind these changes is to make Twitter “more conversational”. Neither feature appears to be groundbreaking if compared to other social media platforms: threaded replies will potentially look very similar to Facebook’s comments, while online status indicators have been used by Facebook Messenger and Instagram’s direct messages. The tech giant is currently collecting feedback before rolling out the changes.

Read the full story (Search Engine Journal)

Distilled News

We kick off the Distilled news this month with a post from Craig Bradford explaining what SEO split testing is.  The post focuses on the simple principle of split testing, how it differs from CRO and UX testing and covers a few examples to outline Distilled’s methodology. As VP in charge of Distilled’s ODN platform, Craig has a front-row seat to this hot area of SEO.

Analyst Paola Didone took her recent experiences of handling large data sets and wrote up a blog post about how to use Regex formulae in Google Sheets.

From the US, SEO Analyst Shannon Dunn suggests an easy approach to optimizing website internal linking structures.

Outside the SEO world, our People Operations Executive Kayla Walker shares her thoughts on how to give better positive feedback.

Distilled’s CEO, Will Critchlow, tried to address the industry’s confusion on the differences between URL structures and Information Architecture.

Last but not least, SearchLove London is approaching! Get your tickets here and do not miss out on the newest and hottest topics in digital marketing. At the time of writing, only 1 VIP ticket is left and you only have two weeks to take advantage of our early bird pricing (£150 off!) which is only available until the 19th of September.

Need further convincing? Will has written up 8 reasons why SearchLove London is worth attending.

Do 404s Hurt SEO and Rankings?

Posted by on Sep 3, 2018 in SEO Articles | Comments Off on Do 404s Hurt SEO and Rankings?

Do 404s Hurt SEO and Rankings?

Status code 404 is probably the most common HTTP error that people encounter when they’re browsing the web. If you’ve been using the internet for over a year, chances that you haven’t encountered one yet are pretty low. They’re very common.

 

Normally, people don’t pay too much attention to them. As a user, you will get frustrated at most and hit the back button or close the tab. As a webmaster, however, more things might be at stake. Many website owners ask themselves if 404 pages hurt their SEO and rankings in any way.

 

What Is Error Code 404?

How to Add/Customize a 404 Page
How to Find 404 Errors

Do 404 Pages Hurt SEO?

What Does Google Say About 404s?
Incoming 404s
How to Fix Incoming 404s
Outgoing 404s (Includes Internal 404s)
How to Fix Outgoing 404s

Building Backlinks with the Broken Links Method

 

Keep reading, as in this article we’ll go over how 404 pages affect your website, SEO and rankings and what you can do to fix things.

 
What Is Error Code 404?

 

Error 404 is a standard HTTP status code (also called response code). When you try to access a URL or a server, the server returns a status code to indicate how the operation went. Assuming that most of the web works fine, the most common status code is 200. If you’re reading this article now, it means that your browser was able to access our server and the server found the requested resource, so it returned a 200 response code.

 

When the client can establish a connection with the server but the server can’t find the requested resource, the server returns a 404 status code. It basically means that the page or resource requested cannot be found at that particular address.

 

To check the response code of a page, you can right-click anywhere on the page in your browser, hit Inspect and then go to the Network section. If you can’t see the status codes, press the F5 key, or refresh the page while the inspector is still open.

 

Chrome Inspector

 

You will usually see a bunch of status codes there. That’s because a page will load multiple resources. For example, the requested page HTML/PHP file might be found, but some image resources have been misspelled or deleted. In this case, the page document type will return a 200 response code, while the missing image resources will return 404s.

 

A 404 status code in your browser will look something like this:

 

CognitiveSEO’s 404 Page

 

As you can see, we have a document type 404 error code, which means the page doesn’t exist or wasn’t found at that address, followed by two 200 status codes that represent a couple of images that have been found.

 

Another option would be to use a tool like https://httpstatus.io/. You can insert multiple URLs and it will return their HTTP status codes. This will only pull out the main status code of each document, excluding any other resources. You can, however, add a resource URL directly.
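If you’d rather script these checks than click through a browser or a web tool, fetching a status code takes only a few lines in most languages. Here’s a minimal Python sketch (the helper name status_of is ours, not part of any tool mentioned here):

```python
import urllib.error
import urllib.request

def status_of(url):
    """Return the HTTP status code for a URL.

    urlopen() raises HTTPError for 4xx/5xx responses, so we catch the
    exception and read the code off it instead of the response object.
    """
    try:
        with urllib.request.urlopen(url) as response:
            return response.getcode()
    except urllib.error.HTTPError as error:
        return error.code
```

Point it at any page of your site (for example, status_of("https://example.com/some-page")) and it will report 200, 404, or whatever else the server answers with.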

 

Response code tool

 

There are other response codes there that you might have heard of. 500, 501 and 503, for example, usually indicate a server error, while 301 and 302 stand for redirects. These, along with 200 and 404, make up the most common status codes on the web.

 

The 301s you see above in the tool and browser inspector are there because I’ve entered the HTTP version instead of the HTTPS version, so a 301 is performed by our server to redirect users to the secure version of our website. I’ve decided to leave them in the screenshots, because they’re a good example of how browsers and status code tools work.

 

It is really important for a page/resource that doesn’t exist to return a 404 status code. If it returns a 200 code, Google might index it.

 

However, to combat this, Google created a “Soft 404” label. Basically, if the page states that the content isn’t found but the HTTP status code is 200, we have a soft 404. You can find these types of errors in Google’s Search Console (formerly Webmaster Tools), under Crawl Errors. If you’re already on the new version of Search Console, the easiest way is to temporarily switch back to the old one.

 

 

Soft 404s aren’t real error codes. They’re just a label added by Google to flag this issue of a missing page returning a 200 code.
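To make the idea concrete, here’s a tiny Python sketch of the kind of heuristic involved; the phrase list is our illustrative guess, not Google’s actual detection logic:

```python
def looks_like_soft_404(status_code, body_text):
    """Heuristic soft-404 check: a page that answers with HTTP 200 but
    whose content reads like a "not found" page. The phrases below are
    an illustrative guess at typical error-page wording."""
    not_found_phrases = ("page not found", "doesn't exist", "no longer available")
    body = body_text.lower()
    return status_code == 200 and any(phrase in body for phrase in not_found_phrases)
```

A page flagged by something like this should be fixed so that it returns a real 404 status code.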

 
How to Add/Customize a 404 Page

 

Normally, your web server should already handle 404s properly. This means that if you try to access a URL that doesn’t exist, the server will already pull out a 404.

However, sometimes the platform might not return a 404, but a blank 200 page. Also, as you can see above, the design isn’t very pleasant and the only option given is to refresh the page… which doesn’t exist. That will keep pulling up a 404 code.

 

It’s a very good idea to have a custom web design for your 404 page. Why? Well, because it can create a better experience for your users. I mean, the experience of not finding what you’re looking for is already bad. But you can add some humor to it, at least.

 

The most important thing on your 404 page is to include a CTA (call to action).

 

Without a call to action, users will most probably leave when they see a regular 404. By inserting some links to some relevant pages, you can hopefully harvest some more traffic to your main pages.

 

Take a look at our example of a 404 page. Big difference, isn’t it? It might actually convince you not to be upset with us. Also, we have a pretty clear CTA that tells you to click on it. It links to the homepage, our hub, from which you can access the most important and relevant parts of our website.

 

cognitiveSEO’s 404 Page Design

 

However, you don’t have to limit yourself to this. You can add links to relevant category pages or other sections of your site. A search bar would also be a great idea.

 

Be creative with your 404’s web design. If it puts a smile on the users’ faces, it might even be better than if they landed on the right page. You can take a look at a few examples in this article, to get your gears spinning.

 

If you have a cool 404 design, share your website with us in the comments section and let’s have a look at it!

 

Most popular CMS (Content Management Systems), like WordPress or Joomla, already have some sort of design implemented. You can easily add a custom design using a plugin. Here’s a plugin for WordPress.

 

If you have a custom built website, then you’ll have to create a 404 template. Log into your Apache web server and create a 404.php file. If you already have one, just edit that. Sometimes, it might have the .html extension. If it doesn’t return a 404 status code, change it to .php, because we’ll need to force the HTTP response status to the proper 404 error code using some PHP.

 

<?php
// Send a real 404 status code along with the custom error page
header("HTTP/1.0 404 Not Found");
?>

 

Then, find your .htaccess file and add the following line to it:

 

ErrorDocument 404 /404.php

 

This will tell the server which page should be shown when a 404 error code is detected. If the line is already there, just modify that. That’s it. Make sure you check everything again with your browser’s inspector or with the tool mentioned above. If it returns a 404 code, you’re good to go!

 
How to Find 404 Errors

 

An easy way to find 404 errors is to log into Google’s Search Console (formerly Webmaster Tools). Those are the 404s that Google sees, so they’re definitely the most important ones.

 

If you see Soft 404 errors, as mentioned above, you have to make sure your 404 page actually returns a 404 error code. If it doesn’t, it’s a good idea to fix that.

 

There are other ways to find 404 errors. If you’re looking for broken pages on your website, which other people have linked to, you can use the cognitiveSEO Site Explorer and check the Broken Pages section.

 

Screenshot from the CognitiveSEO Tool. More details about it below, in the article.

 

If you’re looking to find broken links within your own site, or links to other websites from your website, you can use Screaming Frog. A free alternative would be Xenu Link Sleuth.

 

I’ll show you how to use these SEO tools in detail below.

 
Do 404 Pages Hurt SEO?

 

There are a lot of experts out there stating that 404s will ruin your rankings and that you should fix them as soon as possible. But, the truth is that 404s are a normal part of the web and they are actually useful.

 

Think of it this way: if a place didn’t exist, wouldn’t you rather know that than constantly be directed to other random places? It’s the same on the web. While it’s a good idea to redirect an old, deleted page to a new, relevant page, it’s not such a good idea to redirect every 404 to your homepage, for example. However, I’ve seen some sites redirect their users after a countdown timer, which I thought was a good idea.

 

In theory, 404s have an impact on rankings. But not the rankings of a whole site. If a page returns a 404 error code, it means it doesn’t exist, so Google and other search engines will not index it. Pretty simple, right? What can I say… make sure your pages exist if you want them to rank (ba dum ts).

 

So what’s all the hype about 404s? Well, obviously, having thousands and thousands of 404 pages can impact your website overall.

 

However, it’s not so much the actual 404 pages that hurt SEO, but the links that contain URLs pointing to the 404s.

 

You see, these links create a bad experience. They’re called broken links. If there were no broken links, there wouldn’t even be any 404 errors. In fact, you could say there’s an infinity of potential 404s: just add a slash after your domain, type something random and hit enter. 404. But if search engines can’t find any links pointing to 404s, those 404s might as well not exist at all.

 

I’ll explain everything in more detail soon, so keep reading.

 
What Does Google Say About 404s?

 

Google has always pointed out that 404s are normal. They also seem to be pretty forgiving with them. I mean, that’s natural, considering that they have 404s of their own:

 

 

In fact, they’ve pointed these things out in an article from 2011 and also in this more recently posted video:

 

 

There’s also this source that treats the issue:

 

 

If you want to read more on this, visit this link, then scroll to the bottom and open the Common URL errors dropdown.

 

However, let’s explain everything in more detail. People often forget that there are two types of 404 pages: the ones on your site and the ones on other people’s websites. Both can affect your site, but the ones that affect you most are the ones on other people’s websites.

 

“What? Other websites’ 404s can impact my website?”

 

Yes, that’s right. If your website links to other websites that return a 404, it can negatively impact its rankings. Remember, it’s not so much the 404s that cause the trouble, but the links to the 404s. No links to 404s, no 404s. So you’d better not create links to 404s.

 
Incoming 404s

 

Incoming 404s are URLs on other websites that point to your website but return a 404. They’re not always easy to fix, because you can’t change the URLs on websites you don’t own. However, there are workarounds, such as 301 redirects. Those should be kept as a last option, in case you cannot get the URL fixed.

 

These don’t really affect you negatively. I mean, why should you be punished? Maybe someone misspelled it, or maybe you deleted the page because it’s no longer useful. Should you be punished for that? Common sense kind of says that you shouldn’t and Google agrees.

 

However, this does affect your traffic, as when someone links to you, it sends you visitors. This might lead to bad user experience on your side as well. You can’t always change the actions of others, but you can adapt to them and you can definitely control yours.

 

Most webmasters will be glad to fix a 404, because they know it hurts their website. By sending their users to a location that doesn’t exist, they’re creating a bad experience.

 

If you’ve deleted a page with backlinks pointing to it (although it’s not a good idea to delete such a page) you must make sure you have a 301 redirect set up. If not, all the link equity from the backlinks will be lost.

 

If you don’t redirect backlinks to broken pages on your website to relevant locations, you won’t be penalized or anything, but you will miss out on the link equity.

 

A 301 is mandatory, because often you won’t be able to change all the backlinks. Let’s take social media, for example. On a social media platform like Facebook, one post with a broken link could be shared thousands of times. Good luck fixing all of them!

 

You could also link to your own website with a 404, from your own website. Broken internal linking is common on big websites with thousands of pages or shops with dynamic URLs and filters. Maybe you’ve removed a product, but someone linked to it in a comment on your blog. Maybe you had a static menu somewhere with some dynamic filters that don’t exist anymore. The possibilities are endless.

 
How to Fix Incoming 404s

 

Fixing incoming 404 URLs isn’t always very easy. That’s because you’re not in full control. If someone misspells a link pointing to your website, you’ll have to convince them to fix it. A good alternative to this is to redirect that broken link to the right resource. However, some equity can be lost in the process, so it’s great if you can get them to change the link. Nevertheless, the 301 is mandatory, just to make sure.

 

If you’ve deleted a page, you can let those webmasters know that link to it. Keep in mind that they might not like this and decide to link to another resource. That’s why you have to make sure that the new resource is their best option.

 

To find incoming broken links, you can use cognitiveSEO’s Site Explorer. Type in your website, hit enter, then go to the Broken Pages tab.

 

 

If you click the blue line, you can see what links are pointing to your 404 URL. The green line represents the number of total domains pointing to it. Some domains might link to your broken page multiple times. For example, the second row shows 33 links coming from 12 domains. The green bar is bigger because the ratio is represented vertically (the third green bar is a quarter the size of the second).

 

Then, unfortunately, the best method is to contact the owners of the domains and politely point out that there has been a mistake. Show them the correct/new resource and let them know about the possibility of creating a bad experience for their users when linking to a broken page. Most of them should be happy to comply.

 

Whether you get them to link to the right page or not, it’s a good idea to redirect the broken page to a relevant location. I repeat, a relevant location. Don’t randomly redirect pages or bulk redirect them to your homepage.
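On an Apache server, such a redirect is one line in the same .htaccess file mentioned earlier. The paths below are hypothetical placeholders:

```apache
# Hypothetical paths: send the old, linked-to URL to its closest relevant replacement
Redirect 301 /old-seo-guide/ /blog/new-seo-guide/
```

One line per moved page keeps each redirect pointing at a genuinely relevant destination, rather than dumping everything on the homepage.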

 

It’s also a good idea to do a background check on the domains before redirecting your URLs. Some of them might be spam and you might want to add them to the disavow list.

 

Remember, 404s should generally stay 404. We only redirect them when they get traffic or have backlinks pointing to them. If you change a URL or delete a page and nobody links to it or it gets absolutely no traffic (check with Google Analytics), it’s perfectly fine for it to return a 404.

 
Outgoing 404s (Includes Internal 404s)

 

Outgoing 404s are a lot easier to fix because you have complete control over them. That’s because they’re found on your own website. You’re the one linking to them. Sure, someone might have screwed you over by deleting a page or changing its URL, but you’re still responsible for the quality of your own website.

 

The only 404 links that really hurt your website are the ones on it. When you add a link from your website to another website, make sure that URL actually exists and that you haven’t misspelled it. You might also have broken internal links, which is similar to shooting yourself in the foot.

 

Broken links create a bad user experience, and we all know that Google (and probably other search engines as well) cares about user experience.

 

Google crawls the web by following links from one site to another, so if you tell Google “Hey man, check out this link!” only for it to find a dead end, it’s pretty clear who Google’s going to be mad at.

 

That’s why, from time to time, it’s a good idea to check whether you’re linking out to any 404s. You never know when one shows up. The best way to do it is to use software that crawls your website.
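To illustrate what such crawlers do under the hood, here’s a minimal sketch using only the Python standard library: it extracts the links from a page’s HTML, resolves them against the page URL, and checks each one’s HTTP status. Real tools like Screaming Frog add queuing, politeness, and reporting on top of this same loop.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen
from urllib.error import HTTPError

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag encountered while parsing."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html, base_url):
    """Return absolute URLs for every <a href> in the given HTML."""
    parser = LinkExtractor()
    parser.feed(html)
    return [urljoin(base_url, href) for href in parser.links]

def check_status(url):
    """Return the HTTP status code for a URL (404 included)."""
    try:
        return urlopen(url, timeout=10).status
    except HTTPError as e:
        return e.code
```

Usage would be fetching a page, running `extract_links` on its HTML, and calling `check_status` on each result; anything that comes back 404 is a broken link worth fixing.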

 
How to Fix Outgoing 404s

 

Fixing outgoing 404s is easier because you have full control over them. They’re on your site, so you can change them.

 

To find them, you can use either Screaming Frog or Xenu Link Sleuth. I know Xenu looks shady, but it’s safe, it works and it’s free.

 

If you have a Screaming Frog subscription, go ahead and crawl your website. The free version supports up to 500 URLs, but a new website with under 500 URLs rarely has broken links. After the crawl is finished (it might take hours or even days for big sites), check the Response Codes tab and filter it by Client Error (4xx). At the bottom, go to the Inlinks section to find where the broken URL appears on your website.

 

 

 

Another way to do it is to go to the External tab, but there you won’t find internal broken links. To find their locations, go to Inlinks again.

 

 

If you want a free alternative, go for Xenu. However, things are a little more complicated there. Xenu reports little more than URLs and their status codes, and it doesn’t always follow 301s to crawl your entire site, so you’ll have to specify the correct version of your site, be it HTTP or HTTPS, www or non-www.

 

To begin the crawl, go to File -> Check URL. Then enter your website’s correct main address and hit OK. Make sure that the Check External Links box is checked.

 

 

After the crawl is done, you can sort the list by status codes. However, a better way is to go to View and select Show Broken Links Only. After that, to view the location of the broken link on your site, you’ll have to right click and hit URL properties. You’ll find all the pages that link to it.

 

Unfortunately, I haven’t found a proper way of exporting the link locations, so you’re stuck with right clicking each link manually.

 

After you’ve located the links with either Xenu or Screaming Frog, edit them in your admin section to point to a working URL. For broken internal targets, you can also just 301 them, but some link equity will be lost, so the best thing to do is to fix the links themselves. Even so, keep the 301 redirect in place as a safety net.

 
Building Links with the Broken Links Method

 

These 404s, always a struggle, aren’t they? True, but there’s also a very cool thing about 404s: you can exploit them to build new links.

 

Sounds good, right? Let me explain.

 

Wouldn’t you like someone to point out a broken link on your site? I certainly would. What if they then went even further and gave you a new resource to link to, one even better than the one you were linking to before? Would you consider linking to it?

 

Well, if you find some relevant sites that link to broken pages, you might as well do them a favor and let them know. And how can you do that, exactly? Well, you can use the Broken Pages section of CognitiveSEO’s Site Explorer, of course.

 

 

However, you’ll also need some great content to pitch them if you want this to work. If you don’t have that, they won’t bother linking to you. They’ll just remove the broken link and thank you for pointing it out. So, if you aren’t already working on a great content creation strategy, you should get started.

 

The secret to broken link building, however, is to have awesome content that they can link to.

 

Once you find a website linking to a broken page, all you have to do is email them something like this:

 

Hey there, I was checking your site and followed a link but it leads to a page that doesn’t exist. You might want to fix that, as it creates a bad experience for your users. Also, if you find it appropriate, I have a pretty good resource on that topic you could link to. Let me know if you like it.

 

I’d go one step further and actually search the linked-to site for the resource. If it’s still there at a new location, point that out before pitching your article. They’ll be more likely to trust you this way, and your article becomes an alternative. Also, if the old resource is worse, they’ll be able to compare the two and see the difference.

 

The broken link method is one of the best SEO strategies for link building. If you want to learn more about this method and how to apply it effectively, you can read this awesome article about broken pages link building technique.

 

Conclusion

 

So, if you were wondering if 404 errors hurt SEO, now you know the answer. Anyway, let me summarize it:

 

404 error pages don’t really hurt your SEO, but there’s definitely a lot you can miss out on if you don’t fix them. If you have backlinks pointing to pages on your website that return a 404, try to get those backlinks fixed and 301 redirect your broken URLs to a relevant location. If you have links on your site that point to broken pages, fix those as soon as possible to maximize link equity flow and UX.

 

What are your experiences with 404 pages? Do you constantly check your website for 404s? Have you ever used the broken pages link building method mentioned above? Let us know in the comments section!

The post Do 404s Hurt SEO and Rankings? appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

4 warning signs AdSense is ruining your contextual advertising strategy

Posted by on Aug 30, 2018 in SEO Articles | Comments Off on 4 warning signs AdSense is ruining your contextual advertising strategy

4 warning signs AdSense is ruining your contextual advertising strategy

In the dark ages of the SEO era, when bloggers and webmasters were still familiarizing themselves with the process and its functionality, certain tactics and strategies had become industry standards.

The era I’m talking about is the one where Google AdSense was heavily built into the foundation of a blogger’s strategy. The “legacy” tactics associated with this approach can still be found in the way modern publishers think about SEO and branding strategy. However, AdSense’s limited customizability can hold publishers back. This needs to be addressed and rooted out.

Before assuming AdSense is the best monetization partner for you, consider these four warning signs. If you’re guilty of practicing any of these points, it’s time you re-evaluated your monetization partner and strategy.

1. You haven’t considered other platforms

It’s no secret that AdSense as a standalone monetization stream isn’t enough to earn substantial revenue. Most solopreneurs that still operate in the “blogosphere” have understood for years that it is important to branch out and diversify revenue streams. So there’s nothing revolutionary about this concept.

Most of the focus on diversification has been on developing products to sell, with eBooks being a gold standard. This is great advice, even if it can become a bit boilerplate at times. But we’re not talking about selling products today. We’re talking about contextual advertising, which means placing relevant ads on your site that fit in with the content of your page. When it comes to contextual advertising, too many people still aren’t considering their other options.

Media.net, the second largest contextual advertising business worldwide by revenue, is a good place to start experimenting. The platform uses machine-learning algorithms to predict user intent, based on the content of your pages, and serves ads based on the associated keywords. With Media.net you get exclusive access to Yahoo! Bing’s $6 billion worth of search demand. This allows you to leverage quality advertisers even if you are in a smaller niche.

Performance is obviously different for every site, but Perrin Carrell of AuthorityHacker claims Media.net ads earn them 5x as much as AdSense ads, and Jon Dykstra of FatStacksBlog reported that some Media.net ad placements were earning more revenue than all other ad networks.

One of the biggest advantages of Media.net is that its ads are heavily customizable. Sizes and designs can be tailored to match your site so that they are native to your content and in line with your branding, resulting in higher engagement and revenue. Native ads are a great way to offer your readers an uninterrupted experience, since they look like a natural extension of your website. These ads are also mobile responsive, which means more revenue for you.

Media.net Native Ad Unit

 

Media.net Contextual Ad Unit

From there, you can also consider ad servers like the Google Ad Manager (formerly DoubleClick For Publishers) and OpenX. Ad server platforms like these give publishers great control over ads, including the ability to set preferred deals with a CPM floor, and the option to interact directly with the ad marketplace.

In short, if AdSense is the only ad platform you’ve experimented with, you are missing out on great revenue-generating opportunities.

2. You are picking topics based on AdWords keyword bids

The SEO industry grew up on the Google AdWords Keyword Tool, and its successor, the Keyword Planner. One trend, born in the age of “Made For AdSense” (MFA) blogs and microsites, was to use the Keyword Planner to discover topics to write about based on AdWords bid prices.

This approach was never a good long-term strategy. A blog based on topics chosen to optimize revenue according to this approach often leads to a poorly branded content site that doesn’t naturally adapt to the needs of its audience. The obviously commercial intent of the topics chosen puts a hard ceiling on the size of your recurring audience.

Search engines like sites that build recurring audiences. Those sites earn higher click-through rates in the search results, which Googlers have admitted are used to judge SERP quality.

Modern content creators need to select topics based on what will most successfully help them cultivate an audience. This means identifying keywords that address specific problems you can help users solve. 

You do not find these topics by typing generic industry keywords into the Keyword Planner. You find them by identifying your audience and the platforms they frequent, the kind of questions they ask one another, or even asking them directly what they are most frustrated with, and looking for satisfaction gaps in the answers to those questions. Only then should you turn to the Keyword Planner to start looking for the best keywords to represent your solutions.

The goal isn’t to target valuable keywords, but to target valuable audiences. This is a crucial difference that should guide your strategy at a more foundational level.

3. Your ad placement is based on MFA “best practices” instead of testing

“Best practices” rooted in old school MFA thinking prevent you from building your own monetization strategy from the ground up. They can also hurt your rankings in the search results.

Damaged Rankings

Old-school, “gray hat” MFA tactics, like placing ads where they will be confused for navigation rather than fitting them to your layout and content, were never good branding strategies, and they simply don’t work anymore.

Google’s internal quality rater guidelines explicitly state that sites should never disguise advertisements as the main content or navigation of the site, and if they do they will receive the “lowest” quality rating. Likewise for ads that make the main content unreadable, as well as ads that are distracting because they are too shocking.

Bad Strategy

Even advice that seems innocuous and doesn’t violate search guidelines can be harmful.

Recommendations like “place your ad in the sidebar,” “place it within your content just above the fold,” or “use the 300×250 ad size” are often unhelpful and counterproductive. Advice this specific shouldn’t be given without context, because ads should be placed in a way that fits your site design.

Suggestions like these are always hypotheses that you should test, not rules written in stone. Run your own A/B tests to find out what works for you.
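If you’d rather sanity-check the numbers yourself, a two-proportion z-test is a quick frequentist shortcut (not the Bayesian approach some testing tools use) for judging whether a difference in click-through rate between two ad placements is likely real. The figures below are made up purely for illustration.

```python
import math

def z_score(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-score; |z| > 1.96 is roughly significant at 95%."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se

# Hypothetical data: placement B gets more clicks than placement A
# over the same number of page views.
z = z_score(50, 1000, 80, 1000)
print(round(z, 2))
```

With these invented numbers the z-score clears 1.96, so the difference would be worth acting on; with smaller samples the same gap in click-through rate often isn’t significant, which is exactly why you test rather than trust rules of thumb.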

We recommend Google Analytics Experiments for your testing because their Bayesian statistical methods make it easier to interpret results, because they are free, and because the data is as fully incorporated into Google Analytics as possible.

4. You are not partnering with sponsors

This is one of the biggest opportunities you miss out on if you operate on an AdSense-focused monetization strategy. When you work with sponsors, you can work advertisements entirely into the main content of your blog post, or host articles that are sponsored content created by sponsors themselves. You can negotiate deals that will guarantee a certain level of revenue, which is not always possible using programmatic advertising.

You can collaborate with sponsors on innovative campaigns that will earn the sponsor far more attention than traditional ads, which naturally means they will be willing to spend more. Innovative approaches can also result in more exposure not just for your sponsor, but even for your own brand.

It also lets you monetize on channels where AdSense won’t, such as your social media platforms.

If you aren’t reaching out to potential sponsors to discuss possibilities like these, you are missing out on substantial revenue.

Conclusion

AdSense should not be thought of as central to your contextual advertising strategy, or worse, the foundation of how you approach brand building. Diversify your advertising platforms, migrate your market research outside of AdSense’s native tools, and rely on your own testing strategies. Let your brand drive your monetization strategy, not the other way around.

Manish Dudharejia is the president and founder of E2M Solutions Inc, a San Diego based digital agency that specializes in website design & development and ecommerce SEO. Follow him on Twitter.