SEO Articles

3 SEO Split Tests You Should Try

Yes, split testing for SEO is a thing, and a powerful one at that. In How Split Testing Is Changing Consulting, Will sums up why high priority SEO changes linger in developer backlogs, and how we’re addressing these issues with our ODN platform that allows us to test and roll out these recommendations without using our clients’ developer resources: we can substantiate best practices like H1 changes, alterations to internal links, and rendering content with and without Javascript.

Let’s get started with three tests you should try to see if you can increase organic traffic to your site.

1. Do H1 changes still work?

It won’t come as any surprise to SEOs that testing on-page elements can produce significant changes in rankings. That said, I’ve found that folks can put too much stock in on-page elements: we tend to get keyword tunnel vision and chalk our rankings up to keyword targeting alone. Being able to test these assumptions against Google helps us (dis)prove our hypotheses – and helps us prioritize the right development work.

For iCanvas.com, prioritizing web development work is key: they’re a canvas print company with a robust team of developers, but like most companies, they have limited resources to test technical changes. As a result, dubious SEO-driven changes can’t be prioritized over user experience-driven ones.

We did, however, notice that iCanvas was not targeting product type in their H1 tags. This is what a typical category page (like this one) looked like:

Here, the H1 tag was simply “Beach Decor.” iCanvas was communicating the style and subject of their products in their title tags (that product being canvas art prints), but that context was lost on a given category page. We hypothesized that if we told the world (and, more specifically, Google) what the products are (canvas prints), we would better meet users’ search intent, resulting in more organic search traffic to our test pages. Here’s what the H1 looked like for the test:

After less than a month, we had our answer: our test pages, with “canvas prints” appended to their H1 tags, gained significantly more traffic than our control pages. How’d we measure that?

It helps to know how ODN works (also check out Craig’s post, What is SEO Split Testing?). The most important thing to understand about the chart above is that ODN observes the organic traffic your site captures in real time and builds a forecast of the organic traffic we’d expect to receive in the future. That’s how we got to the nice “7.7% uplift if rolled out” estimate. There is of course volatility – forecasts are rarely perfect, and ours is no exception – which is why we also measure statistical significance against the normal range of variance we’d expect.
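
To make the mechanics concrete, here is a minimal sketch of forecast-based uplift measurement, with made-up numbers and a deliberately naive linear forecast rather than ODN’s actual model:

    import numpy as np

    rng = np.random.default_rng(42)
    pre = 1000 + 5 * np.arange(60) + rng.normal(0, 40, 60)       # 60 days before the change
    post = 1300 + 5 * np.arange(60, 90) + rng.normal(0, 40, 30)  # 30 days after the change

    # Naive forecast: extend the pre-period linear trend into the post period.
    days_pre = np.arange(60)
    slope, intercept = np.polyfit(days_pre, pre, 1)
    forecast = intercept + slope * np.arange(60, 90)

    # Uplift = how far observed traffic sits above the forecast, in percent.
    uplift = (post.sum() - forecast.sum()) / forecast.sum() * 100

    # Crude significance check: compare the mean daily gap to the noise the
    # pre-period data showed around its own trend line.
    residual_sd = np.std(pre - (intercept + slope * days_pre), ddof=2)
    z = (post - forecast).mean() / (residual_sd / np.sqrt(len(post)))

    print(f"Estimated uplift if rolled out: {uplift:.1f}% (z = {z:.1f})")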

As a result, we were confident that this change would positively impact traffic to their site, so we declared this test a winner and rolled the change out to all of their category pages through ODN. This meant that we didn’t have to hijack our developers’ work queue in order to see an immediate benefit. Additionally, we had evidence we could bring to our devs instead of relying exclusively on the promise of following “best practices” in keyword targeting.

2. Will altering internal links give you a big payoff?

Testing changes to internal links is often an ill-defined endeavor. Do you measure changes to PageRank (dubbed local PageRank by Will Critchlow)? Should you look at your log files to observe changes to Google’s crawling behavior?

In our case, iCanvas had a somewhat simpler internal linking issue we wanted to address: self-referential links. As an art company, it’s essential to attribute the creator’s name to their work of art.

As a result, they had made the decision to include a link to the artist of the work on every product listing.

For instance, in the above screenshot of a category page, you can see that each product has its artist listed, and those artists’ names are linked to pages listing all of their available artworks on iCanvas. While this application made sense for category pages where various artists’ products are featured alongside each other, it resulted in redundant links on those individual artists’ pages.

On an artist’s own page, each of these attributions was linking back to the very page it sat on (thus: self-referential links). Our hypothesis was that if we removed these redundant links, we’d better consolidate our PageRank. We knew this change could have a dramatic impact on artists’ products, with more organic traffic flowing to their product pages. Our test, however, would measure the impact on organic traffic to our test group of artist pages. So how did it turn out?

As it turned out, our test was a success: artist pages in our test group received more organic traffic than our control pages. We were again able to test something that would otherwise have been rolled out sitewide simply because it’s touted as “best practice”, without manually setting up test and control groups and measuring the results ourselves. Once we saw the positive impact (less than a month later), we rolled this change out sitewide and had the validation we needed to get the necessary development work prioritized.
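
For illustration, the fix itself can live at the template level. The sketch below is hypothetical (the helper name and URL scheme are invented, not iCanvas’ real code): render the artist attribution as plain text whenever the link would point at the page it already sits on.

    def artist_attribution(artist_slug, current_page_slug):
        # Hypothetical helper: only link the artist's name when the product card is
        # NOT already sitting on that artist's own page.
        label = artist_slug.replace("-", " ").title()
        if artist_slug == current_page_slug:
            # On the artist's own page the link would point back to itself,
            # so render plain text instead of a redundant self-referential link.
            return f"<span>{label}</span>"
        return f'<a href="/artist/{artist_slug}">{label}</a>'

    print(artist_attribution("jane-doe", "beach-decor"))  # linked on a category page
    print(artist_attribution("jane-doe", "jane-doe"))     # plain text on her own page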

3. How good is Google at crawling JavaScript?

If you follow our blog, you’ve already read about how we tested Google’s ability to crawl and render JavaScript. We posited that, because Google wasn’t reliably displaying iCanvas’ products in its Fetch and Render tool, iCanvas’ category and product pages would receive more organic traffic if we used a CSS trigger to load their products instead of relying exclusively on JavaScript.

Above is a screenshot of what we saw (and, presumably, what Googlebot saw) in Fetch and Render of a category page.

After our tweak, however, we plugged one of our test URLs into Fetch and Render, and we could finally produce what users see in their browsers with JS enabled. But did it actually result in additional organic traffic to our test pages?

As you can see above, it did. Based on the performance of our test pages, iCanvas would see an extra 88 pageviews daily with their products triggered through a line of CSS instead of JS. Measuring the impact of this relatively simple change could have taken much longer than this month-long experiment. By the end though, we were ready to roll this out sitewide to ensure that all iCanvas products were crawlable and discoverable.
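
If you want a first-pass check of whether your own listings depend on client-side JavaScript, one rough diagnostic is to look at the raw HTML your server returns. The sketch below is illustrative only – the URL and the markup marker are invented, and it doesn’t capture everything Googlebot’s renderer does:

    import requests

    def products_in_raw_html(url, marker='class="product-card"'):
        # Fetch the page the way a simple crawler would (no JavaScript execution)
        # and check whether the product markup is already in the response.
        html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}, timeout=10).text
        return marker in html

    url = "https://www.example.com/canvas-prints/beach-decor"  # placeholder URL
    if products_in_raw_html(url):
        print("Products are in the initial HTML – crawlable without JavaScript.")
    else:
        print("Products only appear client-side – Googlebot may not see them reliably.")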

Split testing something as simple as on-page SEO can produce meaningful traffic changes, allowing you to validate best practices and gather the evidence your stakeholders (and developers) need to buy into your suggestions. Is it time for you to try SEO split testing?

Read More

Announcing Full-Funnel Testing – testing SEO and CRO at the same time

Until now it’s not been possible to measure the impact of SEO and CRO at the same time. Today we’re proud to announce a new feature of Distilled’s Optimisation Delivery Network that we’re calling full-funnel testing.

Our ODN platform launched with a focus on SEO testing. You have probably thought about it by comparing it to CRO testing tools like Optimizely. If you want to know more about how SEO testing works and how it differs from CRO, you can read this post on what SEO testing is.

The trouble with just using one or the other is you don’t have any insight into how they impact each other.

That’s a big problem, because we know from our testing that a lot of SEO changes impact conversion rate, and a lot of CRO changes (even when they increase conversion rate) can negatively impact organic traffic. If you haven’t read it already, you should check out Will’s blog post on the impact of rolling out negative SEO changes, but here’s an example of when it goes wrong. This chart shows the search impact of a suggested CRO change on SEO: it decreased organic traffic by 25%.

For that reason, we see the relationship between SEO and CRO like this: 

We saw a need to be able to measure SEO and CRO at the same time. For the last few months, we’ve been running a beta version for some of our clients of what we are calling “full-funnel testing”. Today we’re opening that feature up to everyone and we’d like to show you how it works.

How does it work?

Let’s look at CRO first. To run a CRO experiment, we cookie users based on the landing page design they arrive on; they’ll then always see that version as they move between pages.

The result is we know the impact on conversion rate, but we don’t know the impact on SEO.

When we do pure SEO testing, we split pages, not users, and look at the differing impact on search traffic to the control and variant pages:

The result of this framework is that we know the impact on SEO but we don’t know the impact on conversion rate:

A new framework – Full-funnel testing

With full funnel testing, the site is set up initially in the same way as in the pure SEO testing scenario – and then when someone arrives on a landing page, the SEO testing part of the experiment is complete:

We can then pivot into a CRO experiment by dropping a cookie for that user to make sure they see the same template that they first landed on when moving between pages:

Note that, having landed on the Unicorns page initially, they now see the “A” template version on all subsequent pageviews even on pages like Cats and Badgers that would be set up with the “B” template for anyone landing directly on them as a new visitor:

The result is that we are able to measure the impact of changes on SEO and CRO at the same time.
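
As a rough illustration of the mechanics (a simplified sketch, not ODN’s actual implementation): pages are bucketed deterministically by URL for the SEO leg, and the first template a user lands on is then pinned with a cookie for the CRO leg.

    import hashlib

    def page_bucket(url: str) -> str:
        # Deterministic page-level assignment: the same URL always gets the same template,
        # so search engines and first-time visitors see a stable version per page.
        digest = hashlib.md5(url.encode()).hexdigest()
        return "A" if int(digest, 16) % 2 == 0 else "B"

    def template_for_request(url, cookies):
        # CRO leg: once a user has landed somewhere, the cookie pins their template.
        if "ff_template" in cookies:
            return cookies["ff_template"], cookies
        # SEO leg: a brand-new visit (or a crawler) gets the page's own bucket,
        # and the user is pinned to it from this point on.
        bucket = page_bucket(url)
        return bucket, {**cookies, "ff_template": bucket}

    # A visitor lands on /unicorns, then browses to /cats: both pageviews use the
    # template pinned on the first landing, while crawlers keep seeing page_bucket(url).
    tpl, jar = template_for_request("/unicorns", {})
    print(tpl, template_for_request("/cats", jar)[0])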

Thanks for making it this far, you can expect to hear more about this as we get more examples of full-funnel tests and start to share what we learn. If you’d like to know more or see a demo, reach out to us here.

Read More

Google + Is Shutting Down. How Does It Impact Your SEO?

Google+? Have you forgotten about it, too? While many of you seem to have disregarded Google+ over the past few years, it was still there. But now, as the latest news confirms, it will be shut down after user information was exposed. For some it’s no shock to hear the network is being closed down; for others it’s sad news.

All in all, the news may affect plenty of business owners. We’ve talked on our blog before, on multiple occasions, about the effect of social signals on your website. Now it’s time to see what actually happened and what comes next.

All the fuss was triggered by the news that Google+ will be shut down after a user data leak. Sources say over 500,000 users were affected: a bug exposed private profile information to third-party app developers between 2015 and March 2018. Google discovered the issue in March 2018 but did not disclose it, and that decision has now come back to haunt the company.

The problem looks even more serious if we recall a similar situation earlier this year, when Facebook acknowledged that Cambridge Analytica, a British research firm that had worked for the Trump campaign, had improperly gained access to the personal information of up to 87 million Facebook users.

How Google’s Officials Treated the News
Will Shutting Down Google+ Affect Your Business?
What’s Next in Your Social Media Strategy

1. How Google’s Officials Treated the News

The decision to stay quiet drew the attention of the cybersecurity community, because laws in California and Europe require companies to disclose security incidents.

Google, for its part, said the decision was reviewed by its Privacy & Data Protection Office, which concluded it was not legally required to report the incident. In a blog post, the company said it found no evidence that user information had been misused:

We found no evidence that any developer was aware of this bug, or abusing the API, and we found no evidence that any Profile data was misused.

Google

Applications made by other companies had access to Profile fields that users had shared with them but had not marked as public. Google’s officials said:

This data is limited to static, optional Google+ Profile fields including name, email address, occupation, gender and age. It does not include any other data you may have posted or connected to Google+ or any other service, like Google+ posts, messages, Google account data, phone numbers or G Suite content.

Google

So apps did not have access to phone numbers, messages, Google+ posts or data from other Google accounts. Google also said it found no evidence that outside developers had discovered the bug, and the issue was fixed in March 2018.

The ironic part is that some of Google’s top managers stopped posting on Google+ as far back as 2015, three years ago – which is telling, because a Wall Street Journal report showed Google had been exposing user data since around that date, as mentioned previously.

Larry Page, the co-founder of Google, published his last post on 21 August 2015, as you can see in the next screenshot. It seems he gave up on Google+ a long time ago.

Between 30 June 2011 and that final post in 2015, Larry posted just 147 times on his Google+ account – roughly three posts per month, which is very little.

Let’s take a look at the other co-founder, Sergey Brin’s Google+ page. His last post is more recent – 9 September 2017. Of all Google’s management, he used Google+ the longest. His final post was a photo of the Ragged Islands in the Bahamas, taken just a few hours before they bore the brunt of Hurricane Irma.

A few years ago, Sergey Brin said that he is not a very social person and hadn’t spent much time on Facebook or Twitter, Google+’s competition. More recently, he appears to have lost his Twitter account (@sergeybrinn is suspended). Some voices say he keeps a secret personal Facebook page, but we couldn’t find it – there are plenty of fake accounts instead.

Although he expressly said he’s not much of a social person, he used Google+ more than any other Google executive.

Sundar Pichai, the CEO of Google, last posted on 9 March 2016, about Google’s DeepMind challenge. He was the second to give up on Google+, after Larry Page.

And while his Google+ account has been left to languish, his Twitter profile is flourishing. He has 3.69M followers on Google+ and 2.02M on Twitter, yet only around 260 Google+ posts compared with over 1,000 tweets.

Eric Schmidt, the former Executive Chairman of Google, stopped posting on 17 February 2017.

On Twitter, on the other hand, his most recent post was on 3 October 2018. Even if he posts less often overall, he’s clearly more active on Twitter than he was on Google+.

It’s sad to see that even Google’s own management left the social network in free fall. The bigger shock is that none of Google’s six independent board members has ever posted publicly on Google+, according to Mashable.

Source: mashable.com

Seeing these results makes you wonder, given the influence Google+ was supposed to have for businesses. The fact that it was never refurbished explains how it got to this point, and in that light the data breach was, sadly, almost inevitable. So the question we’re left with is: how could the decision to shut down Google+ influence your business, and does it affect your SEO?

2. Will Shutting Down Google+ Affect Your Business?

You’re probably thinking:

If Google’s top representatives don’t use Google+, then why did we?

The answer is simple: because of the influence social signals might have.

We’ve shown you multiple times that social signals are important for driving awareness and building authority for a website, helping to push it up the SERPs. In 2016, we analyzed 23 million shares to see their impact on rankings. We discovered that average Google+ shares for pages ranking 1st were significantly higher, so they (might) have had value for pushing pages to higher positions in Google.

Moreover, higher rankings correlate with high share counts across Facebook, Google+, LinkedIn and Pinterest combined. Losing one link in that chain might affect your website. But if all websites lose the same link, wouldn’t the loss be equal for everyone? Well, that depends: we saw that micro-content ranking 1st correlates with high Google+ share counts.

Google launched the +1 button for websites in 2011. Some voices claimed Google used +1s as a signal for search quality and rankings, but Google denied the allegation and said it had never used Google+ or +1s as a ranking signal.

Google+ started as a promising project but died a slow death, with nothing intriguing to offer. There was a time when Google tried hard to get people to use it and drive conversations on the network, highlighting Google+ content in search results and in Google News and surfacing Google+ discussions. In the end it was all for nothing, and the network even became a running joke in the SEO community.

After a few attempts to refresh the social network, Google stopped pushing it to users and slowly lost interest in it.

The explanation comes in the blog post where Google announced the closure of the social network, acknowledging that it never achieved the adoption they hoped for, despite all the effort:

Our engineering teams have put a lot of effort and dedication into building Google+ over the years, it has not achieved broad consumer or developer adoption, and has seen limited user interaction with apps.

Google

Up to this point, we can say that if Google’s algorithms are updated properly so that G+ signals no longer influence rankings, then no website should be affected. That would be the ideal situation. And, as mentioned before, Google+ accounts will disappear for all websites, so the loss is equal. Looking at it with a critical eye, though, there may be cases where some websites see a slight impact – for example, sites that relied mainly on G+ to promote their business.

3. What’s Next in Your Social Media Strategy

What is Google planning to do now that it is losing Google+? Should we expect new Google products, or something similar to replace it? What we know is what Google has stated.

Users have until August 2019 to save their data, so take whatever you need before the network is closed for good next summer. In the meantime, Google says it will provide additional information to help users download or migrate their data.

Google might create new products or features, but only for businesses, because, as they said, the main focus now is on the enterprise:

We’ve decided to focus on our enterprise efforts and will be launching new features purpose-built for businesses. We will share more information in the coming days.

Google

We’ll just have to wait and see what happens, but one thing is clear: Google says businesses shouldn’t suffer from this decision, as it will come up with a solution.

The post Google + Is Shutting Down. How Does It Impact Your SEO? appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

Read More

Marginal losses: the hidden reason your SEO performance is lagging

Without a structured testing program, our experience shows that it’s very likely that most SEO efforts are at best taking two steps forward and one step back by routinely deploying changes that make things worse.

This is true even when the thinking behind a change is solid, is based on correct data, and is part of a well-thought-out strategy. The problem is not that all the changes are bad in theory – it’s that many changes come with inevitable trade-offs, and without testing, it’s impossible to tell whether multiple small downsides outweigh a single large upside or vice versa.

For example: who among us has carried out keyword research into the different ways people search for key content across a site section, determined that there is a form of words that has a better combination of volume vs competitiveness and made a recommendation to update keyword targeting across that site section?

Everyone. Every single SEO has done this. And there’s a good chance you’ve made things worse at least some of the time.

You see, we know that we are modelling the real world when we do this kind of research, and we know we have leaky abstractions in there. When we know that 20-25% of all the queries that Google sees are brand new and never-before-seen, we know that keyword research is never going to capture the whole picture. When we know that the long tail of rarely-searched-for variants adds up to more than the highly-competitive head keywords, we know that no data source is going to represent the whole truth.

So even if we execute the change perfectly we know that we are trading off performance across a certain set of keywords for better performance on a different set – but we don’t know which tail is longer, nor can we model competitiveness perfectly, and nor can we capture all the ways people might search tomorrow.

Without testing, we put it out there and hope. We imagine that we will see if it was a bad idea – because we’ll see the drop and roll it back. While that may be true if we manage a -27% variant (yes, we’ve seen this in the wild with a seemingly-sensible change), there is a lot going on with large sites and even a large drop in performance in a sub-section can be missed until months after the fact, at which point it’s hard to reverse engineer what the change was. The drop has already cost real money, the downside might be obscured by seasonality, and just figuring it all out can take large amounts of valuable analysis time. When the drop is 5%, are you still sure you’re going to catch it?

And what if the change isn’t perfect?

The more black-box-like the Google algorithm becomes, the more we have no choice but to see how our ideas perform in the real world when tested against the actual competition. It’s quite possible that our “updated keyword targeting” version loses existing rankings but fails to gain the desired new ones.

Not only that, but rankings are only a part of the question (see: why you can’t judge SEO tests using only ranking data). A large part of PPC management involves testing advert variations to find versions with better clickthrough rates (CTR). What makes you think you can just rattle off a set of updated meta information that correctly weights ranking against CTR?

Our testing bets that you can’t. My colleague Dominic Woodman discussed our ODN successes and failures at Inbound 2018 and highlighted just how easy it can be to dodge a bullet if you’re testing SEO changes.

What I learned From Split Testing – Inbound 2018 Snippet from Distilled
We’re talking about small drops here though, right?

Well firstly, no. We have seen updated meta information that looked sensible and was based on real-world keyword data result in a -30% organic traffic drop.

But anyway, small drops can be even more dangerous. As I argued above, big drops are quite likely to be spotted and rolled back. But what about the little ones? If you miss those, are they really that damaging?

Our experience is that a lot of technical and on-page SEO work is all about marginal gains. Of course on large sites with major issues, you can see positive step-changes, but the reality of much of the work is that we are stringing together many small improvements to get significant year-over-year growth via the wonders of compounding.

And in just the same way that friction in financial compounding craters the expected gains (from this article on the effect of fees on investment returns):

If you’re rolling out a combination of small wins and small losses and not testing to understand which are which to roll back the losers, you are going to take a big hit on the compounded benefit, and may even find your traffic flatlining or even declining year over year.
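
To put rough numbers on that, here is an illustrative calculation with made-up percentages: twelve changes shipped in a year, eight worth +2% each and four worth -2% each.

    wins, losses = [1.02] * 8, [0.98] * 4

    with_testing = 1.0
    for change in wins:               # the four losers are caught by tests and rolled back
        with_testing *= change

    without_testing = 1.0
    for change in wins + losses:      # everything ships, good and bad alike
        without_testing *= change

    print(f"With testing:    {with_testing - 1:+.1%}")    # roughly +17%
    print(f"Without testing: {without_testing - 1:+.1%}") # roughly +8%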

You can’t eyeball this stuff – we are finding that it’s hard enough to tell apart small uplifts and small drops in the mix of noisy, seasonal data surrounded by competitors who are also changing things measured against a moving target of Google algorithm changes. So you need to be testing.

No but it won’t happen to me

Well firstly, I think it will. In classroom experiments, we have found that even experienced SEOs can be no better than a coin flip at telling which of two variants will rank better for a specific keyword. Add in the unknown query space and the hard-to-predict human factor of CTR, and I’m going to bet you are getting this wrong.

Still don’t believe me? Here are some sensible-sounding changes we have rolled out and discovered resulted in significant organic traffic drops:

Updating on-page targeting to focus on higher-searched-for variants (the example above)
Using higher-CTR copy from AdWords in meta information for organic results
Removing boilerplate copy from large numbers of pages
Adding boilerplate copy to large numbers of pages

Want to start finding your own marginal gains? Click the button below to find out more about ODN and how we are helping clients find their own winners and losers.

CONTACT US TO FIND OUT MORE ABOUT ODN

Read More

What is an XML sitemap and why should you have one?

A good XML sitemap acts as a roadmap of your website which leads Google to all your important pages. XML sitemaps can be good for SEO, as they allow Google to quickly find your essential website pages, even if your internal linking isn’t perfect. This post explains what XML sitemaps are and how they help you rank better.

What are XML sitemaps?

You want Google to crawl every important page of your website, but sometimes pages end up without any internal links pointing to them, making them hard to find. An XML sitemap lists a website’s important pages, making sure Google can find and crawl them all, and helping it understand your website structure:

Yoast.com’s XML sitemap

Above is Yoast.com’s XML sitemap, created by the Yoast SEO plugin; later on, we’ll explain how our plugin helps you create the best XML sitemaps. If you’re not using our plugin, your XML sitemap may look a little different but will work the same way.

As you can see, Yoast.com’s sitemap index lists several child XML sitemaps: …/post-sitemap.xml, …/page-sitemap.xml, …/video-sitemap.xml and so on. This categorization makes a site’s structure as clear as possible: click on one of the child sitemaps and you’ll see all the URLs it contains. For example, if you click on ‘…/post-sitemap.xml’ you’ll see all of Yoast.com’s post URLs:

Yoast.com’s post XML sitemap

You’ll notice a date at the end of each line. This tells Google when each post was last updated and helps with SEO because you want Google to crawl your updated content as soon as possible. When a date changes in the XML sitemap, Google knows there is new content to crawl and index.

If you have a very large website, it’s sometimes necessary to split one of those child sitemaps. A single XML sitemap is limited to 50,000 URLs, so if your website has more than 50,000 posts, for example, you’ll need two separate sitemaps for the post URLs, effectively adding a second entry to the sitemap index. The Yoast SEO plugin sets the limit even lower – at 1,000 URLs – to keep each XML sitemap loading as fast as possible.
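
To make the structure concrete, here is a minimal sketch of a split sitemap with lastmod dates – illustrative only, with invented URLs, and not the Yoast plugin’s actual output:

    from datetime import date

    MAX_URLS = 1000  # Yoast-style per-sitemap limit; the protocol itself allows up to 50,000

    def urlset(urls):
        entries = "\n".join(
            f"  <url><loc>{u}</loc><lastmod>{d.isoformat()}</lastmod></url>"
            for u, d in urls
        )
        return ('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                f"{entries}\n</urlset>")

    def sitemap_index(sitemap_urls):
        entries = "\n".join(f"  <sitemap><loc>{u}</loc></sitemap>" for u in sitemap_urls)
        return ('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                f"{entries}\n</sitemapindex>")

    # 2,500 invented post URLs get split into three child sitemaps of at most 1,000 URLs.
    posts = [(f"https://example.com/post-{i}", date(2018, 10, 1)) for i in range(2500)]
    chunks = [posts[i:i + MAX_URLS] for i in range(0, len(posts), MAX_URLS)]
    children = [f"https://example.com/post-sitemap{n + 1}.xml" for n in range(len(chunks))]

    print(sitemap_index(children))  # the index points at each child sitemap
    print(urlset(chunks[0][:2]))    # each child lists URLs with their lastmod dates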

What websites need an XML sitemap?

Google’s documentation says XML sitemaps are beneficial for “really large websites”, for “websites with large archives”, for “new websites with just a few external links to it” and for “websites which use rich media content”.

Here at Yoast, while we agree that these kinds of websites will definitely benefit the most from having one, we think XML sitemaps are beneficial for every website. Every single website needs Google to be able to find its most important pages easily and to know when they were last updated, which is why this feature is included in the Yoast SEO plugin.

Which pages should be in your XML sitemap?

How do you decide which pages to include in your XML sitemap? Always start by thinking about the relevance of a URL: when a visitor lands on it, is it a good result? Do you want visitors to land on that URL? If not, it probably shouldn’t be in your XML sitemap. However, if you really don’t want a URL to show up in the search results, you’ll need to add a ‘noindex, follow’ tag as well; leaving it out of your XML sitemap doesn’t mean Google won’t index it. If Google can find the URL by following links, Google can index it.

Example 1: A new blog

Say, for example, you’re starting a new blog. You’ll want Google to find new posts quickly, so your target audience can find your blog in Google, which makes it a good idea to create an XML sitemap right from the start. You might create a handful of first posts, categories for them and some tags to start with. But there won’t be enough content yet to fill the tag overview pages, making them “thin content” that’s not valuable to visitors – yet. In this case, leave the tag URLs out of the XML sitemap for now, and set the tag pages to ‘noindex, follow’ because you don’t want people to find them in search results.

Example 2: Media and images

The ‘media’ or ‘image’ XML sitemap is also unnecessary for most websites. This is because your images are probably used within your pages and posts, so will already be included in your ‘post’ or ‘page’ sitemap. So having a separate ‘media’ or ‘image’ XML sitemap would be pointless and we recommend leaving it out of your XML sitemap. The only exception to this is if images are your main business. Photographers, for example, will probably want to show a separate ‘media’ or ‘image’ XML sitemap to Google.

How to make Google find your XML sitemap

If you want Google to find your XML sitemap quicker, you’ll need to add it to your Google Search Console account. In the new Search Console, you can find the sitemaps report in the ‘Index’ tab. You’ll immediately see whether your XML sitemap has already been added to Search Console; if not, you can add your sitemap at the top of the page:

Yoast.com’s XML sitemap added to the new Google Search Console

Within the old Google Search Console, you can see your sitemaps by navigating to ‘Crawl’ and then clicking on ‘Sitemaps’. If you haven’t added your XML sitemap yet, click the ‘Add/Test sitemap’ button, which you can see to the right of the arrow in the image below.

Yoast.com’s XML sitemap added to the old Google Search Console

As you can see in the image, adding your XML sitemap also lets you check whether all the pages in your sitemap have really been indexed by Google. If there is a big difference between the ‘submitted’ and ‘indexed’ numbers for a particular sitemap, we recommend looking into it further: there could be an error preventing some pages from being indexed, or you may need more content or links pointing to the pages that haven’t been indexed yet.
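
Alongside Search Console, two other well-known ways to point Google at your sitemap are a Sitemap: line in robots.txt (part of the sitemaps protocol) and the sitemap ping URL Google documented at the time of writing. A minimal sketch with a placeholder sitemap URL:

    import urllib.parse

    SITEMAP_URL = "https://example.com/sitemap_index.xml"  # placeholder – use your own

    # 1. A Sitemap: line for robots.txt (part of the sitemaps protocol itself).
    print(f"Sitemap: {SITEMAP_URL}")

    # 2. The sitemap ping URL Google documented at the time of writing; open it in a
    #    browser or fetch it with curl to prompt Google to re-fetch your sitemap.
    ping_url = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(SITEMAP_URL, safe="")
    print(ping_url)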

Yoast SEO and XML sitemaps

Because they are so important for your SEO, we’ve added the ability to create your own XML sitemaps in our Yoast SEO plugin. XML sitemaps are available in both the free and premium versions of the plugin.

Yoast SEO creates an XML sitemap for your website automatically. Click on ‘SEO’ in the sidebar of your WordPress install and then select the ‘Features’ tab:

In this screen, you can enable or disable the different XML sitemaps for your website. Also, you can click on the question mark to expand the information and see more possibilities, like checking your XML sitemap in your browser:

You can exclude content types from your XML sitemap in the ‘Search Appearance’ tab. If you select ‘no’ as an answer to ‘show X in the search results?’ then this type of content won’t be included in the XML sitemap.

Read more about excluding content types here.

Check your own XML sitemap!

Now that you’ve read the whole post, you know how important it is to have an XML sitemap: having one can really help your site’s SEO. Google can easily access your most important pages and posts if you add the right URLs to your XML sitemap, and it can also find updated content easily, so it knows when a URL needs to be crawled again. Lastly, adding your XML sitemap to Google Search Console helps Google find your sitemap quickly and lets you check for sitemap errors.

Now go check your own XML sitemap and make sure you’re doing it right!

Read more: WordPress SEO tutorial: the definitive guide to higher rankings »

The post What is an XML sitemap and why should you have one? appeared first on Yoast.

Read More

The Secret Behind My 1,866,913 Monthly Search Visitors (It’s Not What You Think)

How many visitors do you think NeilPatel.com generates each month?

Maybe a million… maybe 2 million?

I bet you’re going to guess 1,866,913.

If that’s what you guessed, you are wrong. This blog actually generated 2,530,346 visitors. 1,866,913 is the number that came from search engines.

So, what’s the secret to my ever-growing Google traffic?

Sure, I have optimized my on-page SEO, I’ve built links, and I’ve written tons of blog posts… I’ve done all of the stuff that most of my competition has done. But doing the same stuff as your competition isn’t enough.

My secret sauce is that I optimize for user signals.

Last week, I broke down some of the user signals Google looks at, as well as providing benchmarks to aim for if you don’t want to be penalized by Google.

If you aren’t familiar with user signals, check the article I linked to above.

So, how do you optimize for user signals?

Well, I know everyone has different types of websites, so I thought I would share the process I use to optimize NeilPatel.com.

Are you showing people what they want?

Google Analytics is an amazing tool. I’m so addicted to it that I log in at least 3 or 4 times a day. Heck, I even log in on weekends.

But here’s the thing, it only tells you half the story. It gives you numbers, but it doesn’t help you visualize what people are doing and what they aren’t.

For example, here is what my main blog page looked like according to Crazy Egg:

What’s wrong with the image?

Everyone is going to the blog to learn more about marketing. Above the fold, I have a box that showcases an SEO Analyzer. But there is one big issue: it’s barely clicked compared to the drop-down that lets you filter the blog content.

The SEO Analyzer had 128 clicks versus 359 clicks to the content filtering option.

Because you didn’t care for it as much, I removed it from the main blog page. And now when you head to the blog page you can see the filtering options above the fold.

I am looking to see what you click on and what you don’t. Simple as that.
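
If you want to sanity-check that a gap like 128 versus 359 clicks isn’t just noise, a quick binomial test helps. The sketch below only uses the click counts quoted above; everything else is a simplifying assumption:

    from math import sqrt

    analyzer_clicks, filter_clicks = 128, 359
    total = analyzer_clicks + filter_clicks

    p_hat = analyzer_clicks / total          # observed share of clicks for the SEO Analyzer
    z = (p_hat - 0.5) / sqrt(0.25 / total)   # distance from a 50/50 split, in standard errors

    print(f"SEO Analyzer gets {p_hat:.0%} of clicks (z = {z:.1f})")
    # |z| far beyond ~2 means the gap is very unlikely to be noise – strong grounds
    # to demote or remove the less-clicked element.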

If I keep showing you something you aren’t clicking on, I am wasting the opportunity to present you with something you do want to see. Which means I either need to adjust it or delete it.

Now, let me show you my current homepage:

What’s wrong?

Go ahead, take a guess…

Well, looking at the image, you’ll notice there are tons of hot spots in the footer. That’s where the navigation is. With so many clicks going to the navigation, I should consider adding a navigation menu bar to the header.

Are you getting the hang of how to make your website more user-friendly? Well, let’s try another one.

Here’s an element in the sidebar of my blog posts:

That element only has 1 click. That’s terrible considering that the blog post generated 10,016 visits. And to top it off, that click came from a repeat visitor.

My goal is to convert more first-time visitors into leads: they make up the majority of my visitors, but the lowest percentage of my leads.

So, what did I do? I deleted that element and you no longer see it in my sidebar.

Are you optimizing for mobile?

Let’s face it, more people are visiting your site using mobile devices than laptops or traditional computers.

If that’s not the case, it is just a matter of time.

So, have you optimized your site for mobile? And no, I’m not just talking about having a responsive design because everyone is doing that these days.

If you look at the image above, you’ll notice that I removed the image of myself and a few other elements. This helps make the loading experience faster and it helps focus people’s attention on the most important elements.

Similar to the desktop version, my mobile homepage has a 24% conversion rate. When my mobile version included a picture of me above the fold, my conversion rate dropped to 17%… hence there is no picture of me. 😉
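
Whether a gap like 24% versus 17% is meaningful depends on how many visitors sat behind each number. Here is a hedged sketch of a two-proportion z-test – the visitor counts are invented, so substitute your own:

    from math import sqrt

    visitors_a, conversions_a = 5000, 1200   # no photo above the fold  -> 24%
    visitors_b, conversions_b = 5000, 850    # photo above the fold     -> 17%

    p_a, p_b = conversions_a / visitors_a, conversions_b / visitors_b
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
    z = (p_a - p_b) / se

    print(f"{p_a:.0%} vs {p_b:.0%}, z = {z:.1f}")  # |z| above ~2 suggests a genuine difference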

Now, I want you to look at the mobile version of my main blog page and compare it to my homepage.

Do you see an issue?

The blog page generates a lot of clicks on the 3 bars at the top… that’s my navigation menu.

My developer accidentally removed that from the mobile homepage. That’s why the contact button in the footer of the homepage gets too many clicks.

Hopefully, that gets fixed in the next day or two as that could be negatively impacting my mobile rankings.

On top of optimizing the mobile experience, you need to ensure your website loads fast. It doesn’t matter if people are using LTE or 4G; sometimes people have terrible reception, and when they do, your website will load slowly.

By optimizing it for speed, you’ll reduce the number of people who just bounce away from your site.

If you want a faster load time, follow this.

And don’t just optimize your site for speed once and forget about it. As you make changes to your site, your pagespeed score will drop, which means you’ll have to keep re-optimizing.

For example, you’ll notice I have been making a lot of changes to NeilPatel.com (at least that is what the heatmaps above show). As I make those changes, they sometimes affect my pagespeed score negatively, which means I have to go back and optimize my load time again.

A one-second delay in load time will, on average, cost you 6.8% of your revenue.
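
As a quick back-of-the-envelope illustration of what that statistic implies (the revenue figure is a placeholder):

    monthly_revenue = 100_000   # placeholder: your monthly revenue in dollars
    loss_per_second = 0.068     # ~6.8% of revenue per second of added load time

    lost = monthly_revenue * loss_per_second
    print(f"A one-second slowdown costs roughly ${lost:,.0f} per month")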

Are you focusing on helping all of your users?

Not every person who visits your website is the same.

For example, a small percentage of the people who visit NeilPatel.com work at large corporations that are publicly traded and are worth billions of dollars.

And a much larger percentage of my visitors own small and medium-sized businesses. These people are trying to figure out how to grow their traffic and revenue without spending an arm and a leg.

And the largest percentage of my visitors don’t have a website and they are trying to figure out how to get started for free.

In a nutshell, I have three groups of people who visit my website. The first group tends to turn into consulting leads for my agency, but they make up the smallest portion of my traffic.

One could say that I should only focus on helping them and ignore everyone else. But I can’t do that for a few reasons…

I started off with practically no money, and people helped me out when I couldn’t afford to pay them. I love paying it forward and helping people who can’t afford my services because I have been there, and I know what it’s like.
If I only focused on the large companies, who would link to my website and promote my content? You can bet that Microsoft isn’t going to link to me on a regular basis. If you want to generate social shares and backlinks you have to focus on the masses.
Little is the new big… if you can please the masses, they will make noise and the big players will eventually hear about you. So, don’t just treat people with deep pockets kindly, treat everyone the same and truly care about your visitors.

Once you figure out the types of people coming to your website (and if you are unsure just survey them), go above and beyond to help them out. Create different experiences for each group.

On NeilPatel.com, I’ve learned that people who work at large corporations are busy and they want to listen to marketing advice on the run. For that reason, I have the Marketing School podcast.

And a lot of beginners wanted me to break down my steps over video, so they can more easily replicate my tactics. For that reason, I create new videos 3 times per week giving marketing and business advice.

Many of you want to attend the conferences that I speak at, but can’t afford to buy a ticket. For those people, I create weekly webinars that are similar to the speeches I give at conferences.

And best of all, I know the majority of you find it hard to follow along with all of these tips as it can be overwhelming. So, I created Ubersuggest to help you out.

In other words, I try to go above and beyond for all of my visitors.

Yes, it is a lot of work, but if you want to dominate an industry it won’t happen overnight. Expect to put in a lot of time and energy.

Are you taking feedback from people?

You are going to get feedback. Whether it is in the form of email or comments, people will give you feedback.

It’s up to you if you want to listen… but if a lot of people are telling you the same thing you should consider it.

For example, I get a ton of comments on YouTube from people asking me to create videos in Hindi.

And…

Now, I am not only working on adding Hindi subtitles to my videos, but I am also working on translating my blog content to Hindi.

I’m not doing this to make more money… I’m not doing it to become popular… I’m just trying to help out more people.

It’s the same reason why I have Spanish, Portuguese, and German versions of this website. I had enough requests that I pulled the trigger, even though I am not focused on generating income in those regions.

But here is the thing that most people don’t tell you about business. If you just focus on helping people and solving their problems, you’ll notice that your income will go up over time.

Businesses make money not because their goal is to make money… they make money because they are solving a problem and helping people out.

Another piece of feedback I have been getting recently is that my blog is too hard to read on mobile devices.

For that reason, I’ve assigned a task to one of my developers to fix this.

Conclusion

Traffic generation is a business. It’s not a hobby. It’s competitive, and it’s difficult to see short-term gains.

If you want to rank at the top of Google, you can’t treat your website as a hobby. You have to treat it like a business.

And similar to any business, you won’t succeed unless you pay attention to the needs of your customers. That means you have to listen to them. Figure out what they want and provide it.

That’s what Google is trying to do. They are trying to rank sites that people love at the top of their search engine. If you want to be one of those sites, then start paying attention to your visitors.

Show them what they want and go above and beyond so that they will fall in love with your website instead of your competition.

If you aren’t sure whether you are making the right changes, monitor your brand queries. Growth in the number of people searching for your brand terms on Google is a big leading indicator that people are happy with your website.

Just look at NeilPatel.com: I get over 40,000 visitors a month from people Googling variations of my name:

And I generate over 70,000 visits a month just from people searching for my free tool, Ubersuggest.

That’s how I’m continually able to make my traffic grow.
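
One low-tech way to track this is to export your query data from Search Console and total the clicks on brand terms over time. This is a hedged sketch – the column names, file name and brand terms are assumptions to adjust for your own export:

    import csv

    BRAND_TERMS = ("neil patel", "neilpatel", "ubersuggest")  # your own brand variations

    def brand_clicks(csv_path):
        # Sum the clicks for every query containing one of the brand terms.
        total = 0
        with open(csv_path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                query = row["Query"].lower()
                if any(term in query for term in BRAND_TERMS):
                    total += int(row["Clicks"])
        return total

    # Compare month over month: a rising total is the leading indicator described above.
    print(brand_clicks("search_console_queries_october.csv"))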

Yes, I do pay attention to what Google loves, but more importantly, I pay attention to your needs and wants.

Are you going to start optimizing your website for user signals?

The post The Secret Behind My 1,866,913 Monthly Search Visitors (It’s Not What You Think) appeared first on Neil Patel.

Read More