
UX: what is it and why does it matter?


Why are we talking about UX?

Put simply, UX is important in every part of our daily lives, probably more than most of us even notice. Let’s walk through an example.  

The 2000 U.S. presidential election was extremely contentious. Here are the results:

Pay attention to the highlighted “Electoral vote” and “Popular vote” statistics. In the United States, the president wins by getting a majority of electoral college votes. Each of the 50 states has a number of electoral college votes and whichever candidate wins your state gets all of the electoral college votes from your state. The election that year came down to the state of Florida and its electoral college votes. George Bush won the state by 537 votes out of almost six million votes. Crazy!

It gets even more interesting. Here is what a ballot looked like in a county in Florida that year:

If you were voting for George Bush, which button would you press? The answer is the button marked with an A. Easy.

What would you press if you were voting for Al Gore? Well, he is second on the list so you would press the second button labeled B. Wrong. Pressing B would send your vote to the Reform candidate Pat Buchanan. If you wanted to vote for Gore, you would need to press the button labeled C. If you are confused, so were many voters in Florida.

The small margin of victory and this confusing ballot led to a recount of votes and a U.S. Supreme Court decision.

This is a clear example of poor UX. The design choices for the ballot led to confusion and error, and they impacted the result of a U.S. election.

Why does any of this matter for SEO? Let’s talk about it.

How does this matter for SEO?  

In one phrase: Machine Learning.

Machine learning algorithms are taking over SEO. Google still uses traditional SEO signals (links, keywords) but machine learning adds another layer to their algorithm.

Google uses traditional SEO signals to show initial results but then uses machine learning to iterate on those results based on user feedback. If Google displays a page, a user clicks and lands on that page, and the user then immediately bounces back to the SERP, Google’s machine learning algorithms will learn not to keep displaying that result.

When a user does not engage with a page, that sends a very clear signal to Google. This is why UX has become crucial if you want your site to rank.

Is UX hard to measure? It sure is. We at Distilled have been talking about this for years.

Here is an article that explains our approach to quantifying UX and quality signals. Google has human testers who go onto sites and manually rate them on quality and UX. Our survey emulates Google’s human testing and gives us information on UX related issues.

How else can you know that UX is a problem for your site? Think about how many of the following issues apply to your site:

You have done a full technical audit and there are no (or very few) technical issues with your site
You have not been hit by any manual penalty
Your site decreased in rank (and traffic) around the same time Google announced quality updates
Your domain authority is relatively high compared to your competitors and your backlink profile is in a good place

If all or most of the above sounds familiar to you and your site does not rank competitively in its space, UX is a huge potential opportunity.

UX has several components, and as Google’s algorithms continue to advance, sites that take care to emphasize UX will reap the benefits in the SERPs.

Whether you are designing a ballot for a presidential election or building a site to sell t-shirts, UX matters.

Ok, UX is important. I get that. But I still don’t know what it is or what I can do about it.

You

What is UX?

The phrases UX and UX Design get thrown around a lot. Often, if a website or app does not look visually appealing, people say “that site has bad UX.” But what is UX and what does it really mean?

UX is composed of seven key factors:

Useful
Usable  
Findable
Credible
Desirable
Accessible
Valuable

Useful

This is simple. Is your product / website useful? If you have a website, then the question you need to ask yourself is “is my website promoting a product or service people want?”

It is important to note that “useful” is certainly in the eye of the beholder. Your website can promote products or services that provide non-practical benefits, such as PPC. What matters most is that your target audience finds it useful.

Usable

Can users utilize your website or product effectively and efficiently? If not, then you may lose out to competitors. In a world where websites are increasing and attention spans are decreasing, if your site is not easy to use, your competitors will reap the benefits.

Findable

Can users find your product? In the case of websites, is the information and content easy to find? Think about Wikipedia. As soon as you land on a page, you know exactly where the content is and what to expect. In the case of a Wikipedia biography, the first sentence usually contains the pronunciation of the person’s name. The right corner usually has a box with a picture as well as info on birth dates, education, and profession. It doesn’t matter who the person is: if you go on Wikipedia and look at their biography, you will be able to find the information you are looking for.

Credible

“Fool me one time, shame on you, fool me twice can’t put the blame on you” – J.Cole

Web users have no patience for sites that are not credible. A product should do the job and also last a long time. For a website, the information provided should be accurate and fit for whatever brought the user to the page. Even search engines have gotten into the credibility game by favouring sites that use HTTPS over HTTP.

Desirable

Do people want your product? Do people brag about using your product or site? Think about cars. A Toyota and a Mercedes are both great cars. If given either for free, which would you choose?

Desirability is all about branding, design and aesthetics. This is not to say that sites that lack in these areas will not perform well. But if a user can access the same information from a more desirable website, they will undoubtedly choose to do so.

Your local newspaper and news outlets such as the New York Times and the Guardian probably cover similar issues when it comes to major world events. Which outlet do you read?  

Accessible

Accessibility often gets overlooked, but it is crucial. Accessible products and sites are those that can be used by an audience of a wide range of abilities.

Accessibility needs include users with physical or learning impairments. This crucial area of UX gets overlooked because of the assumption that disabled individuals do not make up a large percentage of the market. However, the US census estimates that nearly 20% of Americans have a disability. This number is expected to be even higher in developing nations.

Accessibility is so important that Google has created documentation to help webmasters make their sites more accessible.

Valuable

Value encompasses all of the other principles mentioned. If your product or site is useful, usable, findable, credible, desirable, and accessible, then users will see value in it. If your site does not provide value, then it will not get users.

Thanks for taking the time to read. If you have any thoughts or questions feel free to reach out to me in the comments below or via the Distilled Twitter account.

How to Create, Measure and Optimize High-Quality Content – Google-Friendly

Posted by on Jul 11, 2018 in SEO Articles | Comments Off on How to Create, Measure and Optimize High-Quality Content – Google-Friendly

How to Create, Measure and Optimize High-Quality Content – Google-Friendly

Everybody is talking about high-quality content, but what does it even mean these days? That’s what we are going to find out today. There’s no content marketing inception. The truth is that high-quality content is contextual, and for maximum results it requires three steps: keyword research, good content writing, and on-page search engine optimization.

 

Let’s find out how to perform top-notch keyword research, how to measure results, and how to optimize pages for strong on-page SEO while respecting Google’s quality guidelines. It might sound sophisticated, but as long as you write unique content, provide more value to the user than other pages do, and offer insightful analysis, Google will reward you.

 

 

We have to admit that the web is a noisy environment due to the sheer amount of content being published. So let’s try to shed some light on “high-quality content”: what it means, how to create it, and how to measure it.

 

What is High-Quality Content
How to Create High-Quality Content

Monitor Topics
Perform Keyword Research
Gather Keyword Data From Previous Paid Search Campaigns (optional)
Spy on Your Competitors

How to Optimize Your Content Following the Google-Approved Way

Make a Content Plan
Follow Successful Content Stories for Insights
Use the Terminology You Collected in Keyword Research Phase

Three Methods to Measure High-Quality Content

Google Search Console
Rank Tracking
Google Analytics

 

I have a saying that I go by in the whole process of creating high-quality content: don’t be a dwarf against giants, but rather the peak they are trying to reach.

 
1. What is High-Quality Content

 

Google explains very well what high-quality content is and how you can achieve it:

 

If your pages contain useful information, their content will attract many visitors and entice webmasters to link to your site. In creating a helpful, information-rich site, write pages that clearly and accurately describe your topic.

Google

 

 

The Google Panda algorithm was designed to distinguish high-quality content from all the worthless content and to reward it. In order to get good results, content should first answer some questions regarding its quality, such as:

 

Would you trust the information presented in this article?
Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
Does the article provide original content or information, original reporting, original research, or original analysis?
Does this article have spelling, stylistic, or factual errors?
Does this article provide a complete or comprehensive description of the topic?
Does this article contain insightful analysis or interesting information that is beyond obvious?

 

You can find lots of other questions you should ask yourself on the Google Webmaster Central Blog.

 

After all the algorithm updates that have passed us by, and after witnessing tons of (sometimes useless) content written every day, we have reached the point where we can easily spot the quality from the ordinary on the web. Everybody is talking about quality rather than quantity, but few really know what makes a piece of content high quality.

 

If you start applying the “quality content” label to everything you think is a great idea, you might stop distinguishing the good from the bad.

Quality content is defined by success. It is defined by a goal and it must bring good metrics/results.

Performance is what really matters. Avoid tricks intended to improve your search rankings and “get you on the first page in one week”.

 

It does not matter how much time you spent creating the content or whether you think it might be a good idea. You must understand that your audience has the final say.

 

Your content must be lovable.

 

That’s right: you must attract the audience. Context is a great help in creating lovable content. We can define quality content as content that fulfills the needs of the user, especially now, when personalized content is on the rise and native advertising is a content marketing trend.

 

But what exactly does context mean?

 

Google has developed patents for discovering contextual content through topical searches in order to offer valuable information and improve the user experience. In effect, through context Google wants to recognize user intent.

 

If you want contextual content, you must think carefully about your topic and point out all the answers.

 

If you’re searching for “Batman” in October, most likely you’ll be shown websites selling Halloween costumes. If you’re searching for “Batman” while you are in Turkey, you will probably be redirected to info related to the city of Batman.

 

Contextual content relates to location as well. You have to know your niche very well in order to provide value through your content and appeal to your audience.

 

Your audience gives value to your content.

 

You’ve heard it before: if your audience loves the content, Google will definitely love it too. No more shenanigans, no more shortcuts or shitty content. Talk from your heart, in your own words, from your experience. And share wisdom and advice.

 

Lots of SEO specialists say that the era of “content is king” has ended, and made room for the saying “context is king”.

 

Evergreen content is an example of quality content. It is the type of content that will always be available and accurate, and it will teach your audience. It passes the test of time, staying relevant at every hour of the day. It won’t bring you spikes in traffic; it will sustain it.

 

Imagine being able to offer a quality answer to a timeless question: you’d be sitting on a gold mine. Original content is considered quality content. Zach Bulygo agrees:

Original also means originality. Your ideas should be original! Rehashing the same concepts or other posts over and over again is not original. If your content is played out, no one will link to it – and that defeats the purpose of writing content in the first place.

 Zach Bulygo

Content writer at Kissmetrics / @zachcb1

 
2. How to Create High-Quality Content

 

Quality content doesn’t come very often and it is not something you can achieve fast. Nobody can predict with accuracy if your articles will be successful, but there are some things you could do to influence the outcome.

 
Step 1: Monitor Topics

 

You will find yourself in one of two situations: either you have already written some articles or you are starting fresh. In the first case, you have the advantage of having tested the market and having a clue about your audience. In the second case, it’s time to shine: see what your competitors are doing, know your product, and start writing content, then test and write again until you find what works for you.

 

If you are already on the “content market”, you first need to check the data from your previously published articles to see which ones worked best (with high traffic numbers and higher rankings).

 

Larry Kim explains it better:

 

You need the quantity to find the quality.

Larry Kim

Founder of WordStream

 
Step 2: Perform Keyword Research

 

Once you know what topic works for you, you can search for keywords and try to map out the lexical field so you know all the terminology. Keyword research is mandatory in this phase. There are lots of tools that can easily do the job for you. Keyword Tool is one example that is very easy and fast to use. In a quick search, it reveals thousands of related topics and keyword opportunities.

 

 

There are some things you should know once you get here. Look at the volume and keyword difficulty. Look for keywords with high search volume. If the difficulty score is around 50, that means the competition for that keyword is medium. 

 

In the screenshot above you can see the search results for “protein shakes”. You can find new topic ideas and search for specific keywords depending on which user intent you are targeting.

 

 

Search for questions when the user wants to learn how to do something (for “how-to” articles), for focus keywords when you want to cover a more comprehensive blog post, for phrase matches in specific situations, and so on.

 

Keyword research will never go out of style. It will always be relevant; it’s a must. But another must is knowing how to be selective and wise.

 

Keyword Tool works very well alongside a few other tricks I follow myself. Google is a bundle of information: you can type a specific query into the search bar and see what other people are searching for via the autocomplete feature.
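
If you want to collect those autocomplete ideas in bulk rather than typing them one by one, a small script can do it. The sketch below assumes Node 18+ (for the global fetch) and uses the unofficial suggestqueries endpoint, which is undocumented and may change or be rate limited, so treat it as a convenience rather than a supported API.

```js
// Hedged sketch: pull Google's autocomplete suggestions for a seed query.
// The suggestqueries endpoint is unofficial, so it may change at any time.
async function suggestions(seed) {
  const url = 'https://suggestqueries.google.com/complete/search'
    + `?client=firefox&q=${encodeURIComponent(seed)}`;
  const [, terms] = await (await fetch(url)).json(); // response shape: [query, [suggestions]]
  return terms;
}

suggestions('protein shakes').then(console.log);
```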

 

 

Since Google is becoming more contextual, it is best to try this practice yourself. Results might differ from one location to another. Also, at the bottom of Google’s results page, you can see searches related to your query.

 

 

 

Performing these quick searches might help you get a better understanding of what people are looking for, and also of what type of information already sits on the first page. Do you differentiate yourself from the competition? Are you providing added value? Will your piece of content be unique? If your answer is yes to all these questions, then you’re on the right track.

 

There are also free tools that can strengthen your keyword research:

Google AdWords (there is a free option);
Kparser;
KWFinder;
Answer The Public.

 
Step 3: Gather Keyword Data From Previous Paid Search Campaigns (optional)

 

Google AdWords offers historical information about your data that you can use to drive SEO traffic. For example, you can see the time frames when your campaigns perform best, which keywords work best, and how much traffic you’re bringing to your website, and then optimize for those metrics.

 

 

You can see monthly search information and identify exactly which months have higher or lower search volumes, which matters if you are researching a query for seasonal content. The data available on a free account includes average monthly searches, competition level, and the lowest and highest bid.

 

When you have finished the keyword research, you should focus your attention on the competition. We mentioned it before, and it is a step you shouldn’t skip. Look at the first 10 positions in Google or Bing to see which topics have been covered, what is already written, and what’s missing, with the purpose of highlighting your added value.

 
Step 4: Spy on Your Competitors

 

In terms of content, there are a lot of things to take into consideration, and that’s why tools are so beneficial in the content creation process. For example, Keyword Tool will show a list of all the ranking pages with some extra information that would be hard to collect by hand.

 

In the screenshot below, the data gathered for each page is highlighted:

(1) the content score, which is calculated based on the keyword pattern from all the pages that rank for the specific search queries.
(2) the number of focus keywords used out of the total number of keywords used to calculate the content score.
(3) the readability score, calculated using the Flesch-Kincaid readability scale, which indicates how complicated a piece of text is to understand (see the sketch after this list).
(4) the number of keywords on the page.
(5) the list of keywords used to calculate the Content Performance score for that particular web page.
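
As a side note, here is a minimal sketch of the kind of formula behind that readability score. It implements the Flesch Reading Ease variant (the related Flesch-Kincaid Grade Level uses the same word, sentence and syllable counts) with a deliberately naive syllable counter, so treat the output as a rough approximation of what a dedicated tool reports.

```js
// Flesch Reading Ease: higher scores mean easier text. The syllable counter
// simply counts vowel groups, so results are approximate.
function fleschReadingEase(text) {
  const sentences = Math.max(1, (text.match(/[.!?]+/g) || []).length);
  const words = text.trim().split(/\s+/);
  const syllables = words.reduce((sum, w) =>
    sum + Math.max(1, (w.toLowerCase().match(/[aeiouy]+/g) || []).length), 0);
  return 206.835
    - 1.015 * (words.length / sentences)
    - 84.6 * (syllables / words.length);
}

console.log(fleschReadingEase('Quality content is defined by success. It must bring results.'));
```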

 

 

Having an idea of what your competition is doing gives you the insight to cover what’s missing and to improve search engine content discovery by offering the best piece of content. With such a rich set of data, you’re prepared to go to the next step: optimization.

 
3. How to Optimize Your Content Following the Google-Approved Way

 

If we are talking about optimization, you must understand that natural language paired with traditional on-site SEO techniques is the key to success. Highly readable pages are the winners in the whole content-discovery adventure.

 
Step 1: Make a Content Plan

 

Google evaluates multiple factors using various quality signals to see if your content is relevant to a specific query. At this point, there is nothing tricky, just a lot of math and natural language processing. To save time and effort, Content Assistant will help you identify the exact keywords your content is missing in order to rank higher.

 

 

The mechanism is simple and fast:

paste all the content into the Content Assistant to start analyzing it;
add keyword suggestions to increase your content performance score;
rewrite the underperforming content and add new content, if needed, to make the keyword inclusion more logical and natural.

 

Of course, the content performance score is not the only factor that matters when Google ranks a webpage, but it can give you an idea of the position you can achieve. Link metrics are important as well.

 
Step 2: Follow Successful Optimization Stories for Insights

 

I’ve written about content optimization before, even cited a few success stories, from Jason Acidre, co-founder at Xight Interactive, Greenlane Agency, and lots of others, and they all had one thing in common: knowing their market and adding content naturally – in context, I might add. No spamming. No duplicate content.  

 

The story and the idea remain the most interesting part.

 

Getting the public to like you isn’t an easy job. First you get into their sight, and then you make them fall in love with your content. Even if these seem to be two separate steps, they work together. Here’s the content strategy I follow and try to stick to for each article:

 

Find a topic: usually from social media comments on your page, from your customers in support, from blog comments, different talks, news, newsletters, trends and so on.
Start pulling out some notes so you have the whole idea.
Organize the notes into a structure.
Start documenting and writing: Make sure you have a catchy introduction to appeal to your audience.
Craft an eye-catching headline that offers benefits to the audience and has a strong call to action.
Perform on-site optimization: Make sure you have optimized title tags, meta descriptions, images, and URLs with the keywords you have chosen.
Optimize the content using the Content Assistant tool for 2-3 keywords (max).
Promote the content: newsletter, social media, content syndication and so on.

 
Step 3: Use the Terminology You Collected in Keyword Research Phase

 

In the optimization phase, you must follow a natural path when using focus keywords. As I mentioned before, you must think about the context, and that translates into adding keywords that are relevant to your focus keyword throughout the article.

 

If you use a keyword research tool, you’ll get a full list of recommendations that are very good to add to your content. If you are using Content Assistant, it will be even easier: you can select from the auto-generated keyword list and get insights into how well you will rank in Google.

 

 

Start with “keywords you should use” and focus on the ones that have a bold font and a dot in front of them, then circle back to the rest of the list. Once you have finished adding all the relevant keywords, move on to the “keywords you should use more often” and follow the same procedure.

 

Once you have finished this step, you can promote your blog post and keep track of the outcome.

 
4. Three Methods to Measure the Effects of Your Article

 

Measuring the results is a step that can’t be skipped. You should keep track of the keywords you optimized the content for, so you can see how they evolve.

 
Method 1: Google Search Console

 

GSC, formerly Google Webmaster Tools, is a good support tool for this. Go to your account » Search Traffic » Search Analytics and track individual pages.

 

 

As you can see in the next screenshot, you have data on the number of impressions, average CTR, and average position. Search for your page to see how well it is ranking: Pages » Filter Pages, then paste the URL.

 

 
Method 2: Rank Tracking

 

You can also use the Rank Tracking tool to see the whole list of keywords at once and follow the historical trend line. You have to add the keywords, and after that you’ll have to wait to see how they evolve day by day.

 

 

For a large content optimization effort, you should look at search visibility to see whether or not you’re on an ascending line. Search visibility shows you how your website ranks overall for all the possible keyword combinations that you might or might not be aware of. Below you can see an example:

 

 
Method 3: Google Analytics

 

Google Analytics is another provider that can offer qualitative data most of the time. A while ago I put together an informative guide on how to improve search engine rankings using Google Analytics data, which I recommend reading. It is a good starting point for understanding your audience: likes, interests, behavior, demographics. You’ll find out how to improve your conversion rates and see which type of content brings more traffic, from which sources, along with other technical information.

 

All the data you get will help you manage your content better.

 

Conclusion

 

Writing high-quality content isn’t that hard, but it isn’t a piece of cake either. It requires knowledge, desire, strategic thinking and tools to ease the work. If you understand what quality content and relevant content mean, then you’re two steps ahead.

 

You need an idea, and then you need to craft unique content around it to differentiate yourself from the crowd. On-site optimization is the next step. Use the right keywords to create context and highlight the quality of your content. In the end, promote it and track the results.

 

Analytical data offers valuable insights into your content, audience, and business in general. It can bring a lot of benefits, helping your inbound marketing strategies grow and your current content marketing campaigns outperform. All these steps will help you fulfill your goals, so best of luck in using them!

 


Blurring the Line Between CDN and CMS


Cloudflare recently announced that they’re launching a new feature, called “Cloudflare Workers”. It provides the ability for anybody who’s using Cloudflare as a CDN to write arbitrary JavaScript (based on the standard Service Worker API), which runs on Cloudflare’s edge nodes.

In plain English, you’ll be able to write code which changes the content, headers, look, feel and behaviour of your pages via the Cloudflare CDN. You can do this without making development changes on your servers, and without having to integrate into existing site logic.

If you’re familiar with JavaScript, you can just log into Cloudflare, and start writing logic which runs on top of your server output.

Why is this helpful?

As SEOs, we frequently work with sites which need technical improvements or changes. But development queues are often slow, resources restricted, and website platforms complex to change. It’s hard to get things changed or added.

So many of us have grown comfortable with using workarounds like Google Tag Manager to implement SEO changes – like fixing broken canonical URL tags, or adding robots directives to pages – and hoping that Google respects or understands the conflicting signals we send when we mix on-page and JavaScript-based rules.

But whilst Google professes to be capable of crawling, indexing and understanding JavaScript content and websites, all of the research suggests that they get it wrong as often as they get it right.

Cloudflare’s announcement is significant because, unlike tag management platforms, the alterations are made server-side, before the page is sent to the user – Google only sees the final, altered code and content. There’s no messy JavaScript in the browser, no cloaking, and no conflicting logic.

Service workers on the edge

Cloudflare, like other CDNs, has servers all over the world. When users request a URL on your website, they’re automatically routed to the nearest geographic ‘edge node’, so that users access the site via a fast, local connection. This is pretty standard stuff.

What’s new, however, is that you can now write code which runs at those edge nodes, which allows fine-grained control over how the page is presented to the end user based on their location, or using any logic you care to specify.

With full control over the response from the CDN, it’s possible to write scripts which change title tags, alter canonical URLs, redirect the user, change HTTP headers, or which add completely new functionality; you can adapt, change, delete, build upon or build around anything in the content which is returned from the server.
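
To make that concrete, here’s a minimal sketch of what such a Worker might look like, using the Service Worker-style fetch event described above. The replacement title text and the X-Robots-Tag value are purely illustrative; a real deployment would need far more careful matching and testing.

```js
// Minimal Cloudflare Worker sketch: rewrite the <title> and add a header at the edge.
addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request) {
  const response = await fetch(request);                  // fetch the origin's response
  const type = response.headers.get('Content-Type') || '';
  if (!type.includes('text/html')) return response;       // only touch HTML pages

  let html = await response.text();
  html = html.replace(/<title>[\s\S]*?<\/title>/i,
                      '<title>Rewritten title from the edge</title>');

  const headers = new Headers(response.headers);
  headers.set('X-Robots-Tag', 'noindex');                  // e.g. keep a test path out of the index
  return new Response(html, { status: response.status, headers });
}
```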

It’s worth noting that other platforms, like AWS, already launched something like this in July 2017. The concept of making changes at the edge isn’t completely new, but AWS uses a different approach and technology stack.

Specifically, AWS requires users to write functions in Node.js (a common server-side JavaScript framework), using a specific and proprietary approach to how requests/responses are handled. This comes with some advantages (like being able to use some Node.js libraries) but locks you into a very specific approach.

Cloudflare’s solution is based on the Service Worker API (as opposed to Node.js), which might look like a more future-proof approach.

Service workers are the current framework of choice for progressive web apps (PWAs), managing structured markup, and playing with new/emerging formats as Google (and the wider web) moves from favouring traditional websites to embracing more app-like experiences. That makes it a good skill set to learn, to use, and potentially to recycle existing code and solutions from elsewhere in your ecosystem.

That PWAs look likely to be the next (arguably, the current) big thing means that service workers aren’t going anywhere anytime soon, but Node.js might just be the current flavour of the month.

Getting hands-on

Cloudflare provides a sandbox for you to test and visualise changes on any website, though it’s unclear whether this is part of their launch marketing or something which will be around for the long-term (or a component of the editor/deployment system itself).

That’s a lot of power to play with, and I was keen to explore what it looks like in practice.

It took me just a few minutes to modify one of the scripts on their announcement page to add the word ‘awesome’ (in a pleasing shade of orange) to Distilled’s homepage. You can check out the code here.

Whilst this is hugely powerful, it doesn’t come without risks and drawbacks. For a start, you’ll need to have some sharp JavaScript skills to write any rules, and you’re going to have to do it without any external supporting libraries or frameworks (like jQuery).

Service workers can be complex to work with, too. For example, all of your changes are asynchronous; they all run in parallel, at the same time. That makes things lightning fast, but it means that some complex logic which relies on specific ordering or dependencies might be challenging to write and maintain.

And with all of this, there’s also no nice WYSIWYG interface, guides or tutorials (other than general JS or service worker questions on StackOverflow). You’ll be flying by the seat of your pants, spending most of your time trying to work out why your code doesn’t work. And if you need to turn to your developers for help, you’re back at our initial problem – they’re busy, they have other priorities, and you’re fighting for resources.

A meta CMS is not a toy

As we increasingly find ourselves turning to workarounds for long development cycles, for issues which “can’t be fixed”, and for resolving technical challenges, it’s tempting to see tools like Google Tag Manager and Cloudflare Workers as viable solutions.

If we can’t get the thing fixed, we can patch over it with a temporary solution which we can deploy ‘higher up the stack’ (a level ‘above’/before the CMS), and perhaps reprioritise and revisit the actual problem at a later date.

You can fix your broken redirects. You can migrate to HTTPS and HTTP/2. You can work through all those minor template errors which the development team will never get to.

But as this way of working becomes habit, it’s not unusual to find that the solutions we’re using (whether it’s Google Tag Manager, Cloudflare, or our own ODN) take on the characteristics of ‘Meta CMSs’; systems which increasingly override our templates, content and page logic, and which use CMS-like logic to determine what the end user sees.

Over time, we build up more and more rules and replacements, until we find that there’s a blurring of lines between which bits of our website and content we manage in each platform.

This creates a bunch of risks and challenges, such as:

What happens when the underlying code changes, or when rules conflict?
If you’re using a tag manager or CDN to layer changes ‘on top’ of HTML code and pages, what happens when developers make changes to the underlying site logic?

More often than not, the rules you’ve defined to layer your changes break, with potentially disastrous consequences. And when you’ve multiple rules with conflicting directives, how do you manage which ones win?

How do you know what does what?
Writing rules in raw JavaScript doesn’t make for easily readable, at-a-glance understanding of what’s being altered.

When you’ve got lots of rules or particularly complex scripts, you’ll need a logging or documentation process to provide human-friendly overviews of how all of the moving parts work and interact.

Who logs what’s where?
If conflicts arise, or if you want to update or make new changes you’ll need to edit or build on top of your existing systems. But how do you know which systems – your CMS or your meta CMS – are controlling which bits of the templates, content and pages you want to modify?

You’ve got rules and logic in multiple places, and it’s a headache keeping track.

When the CEO asks why the page he’s looking at is broken, how do you begin to work out why, and where, things have gone wrong?

How do you do QA and testing?
Unless your systems provide an easy way to preview changes, and allow you to expose testing URLs for the purposes of QA, browser testing and similar, you’ve got a system with a lot of power and very little quality control. At the moment, it doesn’t look like Cloudflare supports this.

How do you manage access and versioning?
As your rules change, evolve and layer over time, you’ll need a way of managing version control, change logging, and access/permissions. It’s unclear if, or how Cloudflare will attack this at the moment, but the rest of their ecosystem is generally lacking in this regard.

How do you prevent accidental exposure/caching/PII etc?
When you’ve full access to every piece of data flowing to or from the server, you can very easily do things which you probably shouldn’t – even accidentally. It doesn’t take much to accidentally store, save, or expose private user information, credit card transaction details, and other sensitive content.

With great power comes great responsibility, and just writing-some-javascript can have unintended consequences.

In general then, relying overly on your CDN as a meta CMS feels like a risky solution. It’s good for patching over problems, but it’s going to cause operational and organisational headaches.

That’s not to say that it’s not a useful tool, though. If you’re already on Cloudflare, and you have complex challenges which you can resolve as a one-off fix using Cloudflare Workers, then it’s a great way to bypass the issue and get some easy wins.

Alternatively, if you need to execute geographically specific content, caching or redirect logic (at the closest local edge node to the user), then this is a really great tool – there are definitely use cases around geographically/legally restricted content where this is the perfect tool for the job.

Otherwise, it feels like trying to fix the problem is almost always going to be the better solution. Even if your developers are slow, you’re better off addressing the underlying issues at their source than patching on layers of (potentially unstable) fixes over the top.

Sometimes, Cloudflare Workers will be an elegant solution – more often than not, you should try to fix things the old-fashioned way.

ODN as a meta CMS

Except, there may be an exception to the rule.

If you could have all of the advantages of a meta CMS, but with provisions for avoiding all of the pitfalls I’ve identified – access and version control, intuitive interfaces, secure testing processes, and documentation – you could solve all of your technical SEO challenges overnight, and they’d stay solved.

And whilst I want to stress that I’m not a sales guy, we have a solution.

Our ‘Optimisation Delivery Network’ product (Distilled ODN for short) does all of this, with none of the disadvantages we’ve explored.

We built, and market our platform as an SEO split-testing solution (and it’s a uniquely awesome way to measure the effectiveness of on-page SEO changes at scale), but more interestingly for us, it’s essentially a grown-up meta CMS.

It works by making structured changes to pages, between the request to the server and the point where the page is delivered back to the user. It can do everything that Google Tag Manager or Cloudflare can do to your pages, headers, content and response behaviour.

And it has a friendly user interface. It’s enterprise-grade, it’s scalable, safe, and answers to all of the other challenges we’ve explored.

We have clients who rely on ODN for A/B testing their organic search traffic and pages, but many of these also use the platform to just fix stuff. Their marketing teams can log in, define rules and conditions, and fix issues which it’d typically take months (sometimes years) for development teams to address.

So whilst ODN still isn’t a perfect fix – if you’re in need of a meta CMS then something has already gone wrong upstream – it’s at least a viable, mature and sophisticated way of bypassing clunky development processes and delivering quick, tactical wins.

I expect we’ll see much more movement in the meta CMS market in the next year or so, especially as there are now multiple players in the space (including Amazon!); but how viable their products will be – if they don’t have usable interfaces and account for organisational/operational challenges – is yet to be seen.

In the meantime, you should have a play with Cloudflare’s sandbox, and if you want more firepower and a stronger safety net, get in touch with us for a Distilled ODN demo.

Writing Content That Is Too In-Depth Is Like Throwing Money Out the Window


You’ve heard people telling you that you need to write in-depth content because that’s what Google wants.

And it’s true… the average page that ranks on page 1 of Google contains 1,890 words.

But you already know that.

The question is, should you be writing 2,000-word articles? 5,000? Or maybe even go crazy and create ultimate guides that are 30,000 words?

What’s funny is, I have done it all.

I’ve even tested out adding custom images and illustrations to these in-depth articles to see if that helps.

And of course, I tested if having one super long page with tens of thousands of words or having multiple pages with 4,000 or 5,000 words is better.

So, what do you think? How in-depth should your content be?

Well, let’s first look at my first marketing blog, Quick Sprout.

Short articles don’t rank well

With Quick Sprout, it started off just like any normal blog.

I would write 500 to 1,000-word blog posts and Google loved me.

Just look at my traffic during January 2011.

As you can see, I had a whopping 67,038 unique visitors. That’s not too bad.

Even with the content being short, it did fairly well on Google over the years.

But over time, more marketing blogs started to pop up, competition increased, and I had no choice but to write more detailed content.

I started writing posts that were anywhere from 1,000 to a few thousand words. When I started to do that, I was able to rapidly grow my traffic from 67,038 to 115,759 in one year.

That’s a 72.67% increase in traffic in just 1 year.

It was one of my best years, and all I had to do was write longer content.

So naturally, I kept up with the trend and continually focused on longer content.

But as the competition kept increasing, my traffic started to stagnate, even though I was producing in-depth content.

Here are my traffic stats for November 2012 on Quick Sprout.

I understand that Thanksgiving takes place in November, hence traffic wasn’t as high as it could be. But still, there really wasn’t any growth from January to November of 2012.

In other words, writing in-depth content that was a few thousand words max wasn’t working out.

So what next?

Well, my traffic had plateaued. I had to figure something else out.

Writing longer, more in-depth content had helped me before… so I thought, why not try the 10x formula.

I decided to create content 10 times longer, better, and more in-depth than everyone else. I was going to the extreme because I knew it would reduce the chance of others copying me.

Plus, I was hoping that you would love it as a reader.

So, on January 24, 2013, I released my first in-depth guide.

It was called The Advanced Guide to SEO.

It was so in-depth that it could have been a book.

Literally!

Heck, some say it was even better than a book as I paid someone for custom illustration work.

Now let’s look at the traffic stats for January 2013 when I published the guide.

As you can see my traffic really started to climb again.

I went from 112,681 visitors in November to 244,923 visitors in January. Within 2 months I grew my traffic by 117%.

That’s crazy!!!!

The only difference: I was creating content that was so in-depth that no one else dared to copy me (at that time).

Sure, some tried and a few were able to create some great content, but it wasn’t like hundreds of competing in-depth guides were coming out each year. Not even close!

Now, when I published the guide I broke it down into multiple chapters like a book because when I tested out making it one long page, it loaded so slow that the user experience was terrible.

Nonetheless, the strategy was effective.

So what did I do next?

I created 12 in-depth guides

I partnered up with other marketers and created over 280,000 words of marketing content. I picked every major subject… from online marketing to landing pages to growth hacking.

I did whatever I could to generate the most traffic within the digital marketing space.

It took a lot of time and money to create all 12 of these guides, but it was worth it.

By January of 2014, my traffic had reached all-time highs.

I was generating 378,434 visitors a month. That’s a lot for a personal blog on marketing.

Heck, that’s a lot for any blog.

In other words, writing 10x content that was super in-depth worked really well. Even when I stopped producing guides, my traffic continually rose.

Here’s my traffic in January 2015:

And here’s January 2016 for Quick Sprout:

But over time something happened. My traffic didn’t keep growing. And it didn’t stay flat either… it started to drop.

In 2017, my traffic dropped for the first time.

It went from 518,068 monthly visitors to 451,485. It wasn’t a huge drop, but it was a drop.

And in 2018 my traffic dropped even more:

I saw a huge drop in 2018. Traffic went down to just 297,251 monthly visitors.

And sure, part of that is because I shifted my focus to NeilPatel.com, which has become the main place I blog now.

But it’s largely that I learned something new when building up NeilPatel.com.

Longer isn’t always better

Similar to Quick Sprout, I have in-depth guides on NeilPatel.com.

I have guides on online marketing, SEO, Google ads, Facebook ads, and the list goes on and on.

If you happened to click on any of the guides above you’ll notice that they are drastically different than the ones on Quick Sprout.

Here are the main differences:

No fancy design – I found with the Quick Sprout experience, people love the fancy designs, but over time content gets old and outdated. To update content when there are so many custom illustrations is tough, which means you probably won’t update it as often as you should. This causes traffic to go down over time because people want to read up-to-date and relevant information.
Shorter and to the point – I’ve found that you don’t need super in-depth content. The guides on NeilPatel.com rank in similar positions on Google and cap out at around 10,000 words. They are still in-depth, but I found that after 10,000 or so words there are diminishing returns.

Now let’s look at the stats.

Here’s the traffic to the advanced SEO guide on Quick Sprout over the last 30 days:

Over 7,842 unique pageviews. There are tons of chapters and as you can see people are going through all of them.

And now let’s look at the NeilPatel.com SEO guide:

I spent a lot less time, energy, and money creating the guide on NeilPatel.com, yet it receives 17,442 unique pageviews per month, which is more than the Quick Sprout guide. That’s a 122% difference!

But how is that possible?

I know what you are thinking. Google wants people to create higher quality content that benefits people.

So how is it that the NeilPatel.com one ranks higher?

Is it because of backlinks?

Well, the guide on Quick Sprout has 850 referring domains:

And the NeilPatel.com has 831 referring domains:

Plus, they have similar URL ratings and domain ratings according to Ahrefs so that can’t be it.

So, what gives?

Google is a machine. It doesn’t think with emotions, it uses logic. While we as a user look at the guide on Quick Sprout and think that it looks better and is more in-depth, Google focuses on the facts.

See, Google doesn’t determine if one article is better than another by asking people for their opinion. Instead, they look at the data.

For example, they can look at the following metrics:

Time on site – which content piece has a better time on site?
Bounce rate – which content piece has the lowest bounce rate?
Back button – does the article solve all of the visitors’ questions and concerns? So much so that the visitor doesn’t have to hit the back button and go back to Google to find another web page?

And those are just a few things that Google looks at from their 200+ ranking factors.

Because of this, I took a different approach to NeilPatel.com, which is why my traffic has continually gone up over time.

Instead of using opinion and spending tons of energy creating content that I think is amazing, I decided to let Google guide me.

With NeilPatel.com, my articles range from 2,000 to 3,000 words. I’ve tried articles with 5,000+ words, but there is no guarantee that the more in-depth content will generate more traffic or that users will love it.

Now to clarify, I’m not trying to be lazy.

Instead, I’m trying to create amazing content while being short and to the point. I want to be efficient with both my time and your time while still delivering immense value.

Here’s the process I use to ensure I am not writing tons of content that people don’t want to read.

Be data driven

Because there is no guarantee that an article or blog post will do well, I focus on writing amazing content that is 2,000 to 3,000-words long.

I stick within that region because it is short enough where you will read it and long enough that I can go in-depth enough to provide value.

Once I release a handful of articles, I then look to see which ones you prefer based on social shares and search traffic.

Now that I have a list of articles that are doing somewhat well, I log into Google Search Console and find those URLs.

You can find a list of URLs within Google Search Console by clicking on “Search Traffic” and then “Search Analytics”.

You’ll see a screen load that looks something like this:

From there you’ll want to click on the “pages” button. You should be looking at a screen that looks similar to this:

Find the pages that are gaining traction based on total search traffic and social shares and then click on them (you can input URLs into Shared Count to find out social sharing data).

Once you click on the URL, you’ll want to select the “Queries” icon to see which search terms people are finding that article from.
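
If you’d rather pull this data programmatically (for example, to check dozens of articles at once), the Search Console Search Analytics API exposes the same report. The sketch below is just one way to do it: it assumes the v3 API, an OAuth 2.0 access token you’ve already obtained with the webmasters.readonly scope, and Node 18+ for the global fetch. The site URL, dates, and page URL are placeholders; the UI steps above are all you strictly need.

```js
// Hedged sketch: top queries for one page from the Search Console Search Analytics API (v3).
async function topQueriesForPage(siteUrl, pageUrl, token) {
  const endpoint = 'https://www.googleapis.com/webmasters/v3/sites/'
    + encodeURIComponent(siteUrl) + '/searchAnalytics/query';
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${token}`,      // OAuth 2.0 access token (placeholder)
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      startDate: '2018-04-01',
      endDate: '2018-06-30',
      dimensions: ['query'],
      dimensionFilterGroups: [{
        filters: [{ dimension: 'page', operator: 'equals', expression: pageUrl }]
      }],
      rowLimit: 25
    })
  });
  const data = await res.json();
  return data.rows || [];  // each row: { keys: ['query'], clicks, impressions, ctr, position }
}
```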

Now go back to your article and make it more in-depth.

And when I say in-depth, I am not talking about word count like I used to focus on at Quick Sprout.

Instead, I am talking depth… did the article cover everything that the user was looking for?

If you can cover everything in 3,000 words then you are good. If not, you’ll have to make it longer.

The way you do this is by seeing which search queries people are using to find your articles (like in the screenshot above). Keep in mind that people aren’t searching Google in a deliberate effort to land on your site… people use Google because they are looking for a solution to their problem.

Think of those queries that Google Search Console is showing you as “questions” people have.

If your article is in-depth enough to answer all of those questions, then you have done a good job.

If not, you’ll have to go more in-depth.

In essence, you are adding more words to your article, but you aren’t adding fluff.

You’re not keyword stuffing either. You are simply making sure to cover all aspects of the subject within your article.

This is how you write in-depth articles and not waste your time (or money) on word count.

And that’s how I grew NeilPatel.com without writing too many unnecessary words.

Conclusion

If you are writing 10,000-word articles you are wasting your time. Heck, even articles over 5,000 words could be wasting your time if you are only going after as many words as possible and adding tons of fluff along the way.

You don’t know what people want to read. You’re just taking a guess.

The best approach is to write content that is amazing and within the 2,000-word to 3,000-word range, assuming you’re in a competitive industry. If your industry isn’t as competitive (and it lacks content online) then you can get away with posts under 1,000 words.

Once you publish the content, give it a few months and then look at search traffic as well as social sharing data to see what people love.

Take those articles and invest more resources into making them better and ultimately more in-depth (in terms of quality and information, not word count).

The last thing you want to do is write in-depth articles on subjects that very few people care about.

Just look at the Advanced Guide to SEO on Quick Sprout… I made an obvious mistake. I made it super in-depth on “advanced SEO”. But when you search Google for the term “SEO” and you scroll to the bottom to see related queries you see this…

People are looking for the basics of SEO, not advanced SEO information.

If I wrote a 2,000-word blog post instead of a 20,000-word guide, I could have caught this early on and adapted the article more to what people want versus what I thought they wanted.

That’s a major difference.

So how in-depth are you going to make your content?


The SEO Apprentice’s Toolbox: Gearing Up for Analysis


Being new to SEO is tricky. As a niche market within a niche market, there are many tools and resources unfamiliar to most new professionals. And with so much to learn, it is nearly impossible to start real client work without first dedicating six months exclusively to industry training. Well…that’s how it may seem at first.

While it may be intimidating, investigating real-world problems is the best way to learn SEO. It exposes you to industry terminology, introduces you to valuable resources and gets you asking the right questions.

As a fairly new Analyst at Distilled, I know from experience how difficult it can be to get started. So here’s a list of common SEO analyses and supporting tools that may help you get off on the right foot.

Reviewing on-page elements

Page elements are essential building blocks of any web page. And pages with missing or incorrect elements risk not being eligible for search traffic. So checking these is necessary for identifying optimization opportunities and tracking changes. You can always go to the HTML source code and manually identify these problems yourself, but if you’re interested in saving a bit of time and hassle, Ayima’s Google Chrome extension Page Insights is a great resource.

This neat little tool identifies on-page problems by analyzing 24 common on-page issues for the current URL and comparing them against a set of rules and parameters. It then provides a list of all issues found, grouped into four priority levels: Errors, Warnings, Notices and Page Info. Descending from most to least severe, the first 3 categories (Errors, Warnings & Notices) identify all issues that could impact organic traffic for the page in question. The last category (Page Info) provides exact information about certain elements of the page.

For every page you visit Page Insights will give a warning next to its icon, indicating how many vulnerabilities were found on the page.

Clicking on the icon gives you a drop-down listing the vulnerabilities and page information found.

What makes this tool so useful is that it also provides details about each issue, like how it can harm the page, along with correction opportunities. In this example, we can see that this web page is missing an H1 tag, but in this case it could be corrected by adding an H1 tag around the page’s current heading (which is not coded as an H1).

In a practical setting, Page Insights is great for quickly identifying common on-page issues that should be fixed to ensure SEO best practice.
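
If you want a quick, scriptable first pass before reaching for an extension, something like the naive sketch below can flag the most obvious gaps. It assumes Node 18+ (for the global fetch) and relies on simple regex checks against the raw HTML, so it is nowhere near as thorough as Page Insights’ 24 checks; treat it as a starting point only.

```js
// Naive sketch: fetch a page and flag a few common on-page gaps.
async function checkOnPage(url) {
  const html = await (await fetch(url)).text();
  const issues = [];

  const title = html.match(/<title[^>]*>([\s\S]*?)<\/title>/i);
  if (!title) issues.push('Missing <title> tag');
  else if (title[1].trim().length > 60) issues.push('Title longer than ~60 characters');

  if (!/<h1[\s>]/i.test(html)) issues.push('Missing <h1> tag');
  if (!/<meta[^>]+name=["']description["']/i.test(html)) issues.push('Missing meta description');
  if (!/<link[^>]+rel=["']canonical["']/i.test(html)) issues.push('Missing canonical link');

  return issues;
}

checkOnPage('https://example.com/').then(issues =>
  console.log(issues.length ? issues : 'No obvious on-page issues found'));
```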

Additional tools for reviewing on-page elements:

Varvy (SEO feature)

Moz Bar

Supplemental readings:

12 Essential On-Page SEO Elements for Beginners

The Best On-Page SEO Tool in the Business Now Has Unlimited Access via MozBar

Title Tag Length Guidelines: 2016 Edition

Analyzing page performance

Measuring the load functionality and speed of a page is an important and common practice since both metrics are correlated with user experience and are highly valued by search engines. There are a handful of tools that are applicable to this task but because of its large quantity of included metrics, I recommend using WebPagetest.org.

Emulating various browsers, this site allows users to measure the performance of a web page from different locations. After sending a real-time page request, WebPagetest provides a sample of three tests containing request details, such as the complete load time, the load time breakdown of all page content, and a final image of the rendered page. There are various configuration settings and report types within this tool, but for most analyses, I have found that running a simple test and focusing on the metrics presented in the Performance Results supply ample information.

There are several metrics presented in this report, but data provided in Load Time and First Byte work great for most checks. Factoring in Google’s suggestion to have desktop load time no greater than 2 seconds and a time to first byte of 200ms or less, we can gauge whether or not a page’s speed is properly optimized.
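 
If you just need a rough sanity check against those two thresholds before running a full WebPagetest report, a few lines of script will do. The sketch below assumes Node 18+ (global fetch); fetch() resolves once response headers arrive, which is close to, but not exactly, time to first byte, so WebPagetest’s browser-based measurements remain far more accurate.

```js
// Rough, application-level timing approximation (not a substitute for WebPagetest).
async function roughTiming(url) {
  const t0 = Date.now();
  const res = await fetch(url);
  const ttfb = Date.now() - t0;   // headers received (approximate first byte)
  await res.text();               // download the full body
  const total = Date.now() - t0;
  console.log(`TTFB ~${ttfb} ms (target: under 200 ms), full fetch ~${total} ms (target: under 2 s)`);
}

roughTiming('https://example.com/');
```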

Prioritizing page speed performance areas

Knowing if a page needs to improve its performance speed is important, but without knowing what areas need improving you can’t begin to make proper corrections. Using WebPagetest in tandem with Google’s PageSpeed Insights is a great solution for filling in this gap.

Free for use, this tool measures a page’s desktop and mobile performance to evaluate whether it has applied common performance best practices. Scored on a scale of 0-100 a page’s performance can fall into one of three categories: Good, Needs Work or Poor. However, the key feature of this tool, which makes it so useful for page speed performance analysis, is its optimization list.

Located below the review score, this list highlights details related to possible optimization areas and good optimization practices currently in place on the page. By clicking the “Show how to fix” drop down for each suggestion you will see information related to the type of optimization found, why to implement changes and specific elements to correct.

In the image above, for example, compressing two images to reduce the number of bytes that need to be loaded can improve this web page’s speed. Making this change could reduce the page’s image byte size by 28%.

Using WebPagetest and PageSpeed Insights together can give you a comprehensive view of a page’s speed performance and assist in identifying and executing on good optimization strategies.

Additional tools for analyzing page performance:

Lighthouse

Varvy (Pagespeed Optimization)

Supplemental readings:

Faster Sites: Beyond PageSpeed Insights

How a Webpage is Loaded and Displayed

Investigating rendering issues

How Googlebot (or Bingbot or MSNbot) crawls and renders a page can be completely different from what is intended; this typically occurs when the crawler is blocked from resources by a robots.txt file. If Google sees an incomplete or blank page, it assumes the user is having the same experience, which could affect how that page performs in the SERPs. In these instances, the Webmaster tool Fetch as Google is ideal for identifying how Google renders a page.

Located in Google Search Console, Fetch as Google allows you to test whether Googlebot can access pages of a site, identify how it renders the page and determine whether any resources are blocked from the crawler.

When you look up a specific URL (or domain) Fetch as Google gives you two tabs of information: fetching, which displays the HTTP response of the specified URL; and rendering, which runs all resources on the page, provides a visual comparison of what Googlebot sees against what (Google estimates) the user sees and lists all resources Googlebot was not able to acquire.

For an analysis application, the rendering tab is where you need to look. Begin by checking the rendering images to ensure both Google and the user are seeing the same thing. Next, look at the list to see which resources were unreachable by Googlebot and why. If the visual elements do not display a complete page and/or important page elements are blocked from Googlebot, that is an indication that the page has rendering issues and may perform poorly in search.

Additional tools for investigating rendering issues:

Screaming Frog (Render Page tab)

Supplemental readings:

Fetch & Horror: 3 examples of how fetch & render in GSC can reveal big SEO problems

JavaScript & SEO: Making Your Bot Experience As Good As Your User Experience

Checking backlink trends

Quality backlinks are extremely important for making a strong web page, as they indicate a page’s reliability and trustworthiness to search engines. Changes to a backlink profile could easily affect how a page is ranked in the SERPs, so checking this is important for any webpage/website analysis. As a testament to its importance, there are several tools dedicated to backlink analytics. However, I have a preference for Ahrefs due to its comprehensive yet simple layout, which makes it great for on-the-spot research.

An SEO tool well known for its backlink reporting capabilities, Ahrefs measures several backlink performance factors and displays them in a series of dashboards and graphs. While there is plenty to review, for most analysis purposes I find the “Backlinks” metric and “New & lost backlinks” graph to be the best places to focus.

Located under the Site Explorer tab, “Backlinks” identifies the total number of backlinks pointing to a target website or URL. It also shows the quantitative changes in these links over the past 7 days with the difference represented by either a red (negative growth) or green (positive growth) subscript. In a practical setting, this information is ideal for providing quick insight into current backlink trend changes.

Under the same tab, the “New & lost backlinks” graph provides details about the total number of backlinks gained and lost by the target URL over a period of time.

The combination of these particular features works very well for common backlink analytics, such as tracking backlinks profile changes and identifying specific periods of link growth or decline.

Additional tools for checking backlink trends:

Majestic

Moz Open Site Explorer

SEMrush

Supplemental readings:

How to Do a Link Audit in 30 Minutes

How Checking Your Link Profile Can Save Your Site

Creating your toolbox

This is only a sample of tools you can use for your SEO analyses and there are plenty more, with their own unique strengths and capabilities, available to you. So make sure to do your research and play around to find what works.

And if you take away only one thing from this post, remember this: as you build your own personal toolbox, what you choose to include should work best for your needs and the needs of your clients.

Proposing Better Ways to Think about Internal Linking

Posted by on Jul 11, 2018 in SEO Articles | Comments Off on Proposing Better Ways to Think about Internal Linking

Proposing Better Ways to Think about Internal Linking

I’ve long thought that there was an opportunity to improve the way we think about internal links, and to make much more effective recommendations. I feel like, as an industry, we have done a decent job of making the case that internal links are important and that the information architecture of big sites, in particular, makes a massive difference to their performance in search (see: 30-minute IA audit and DistilledU IA module).

And yet we’ve struggled to dig deeper than finding particularly poorly-linked pages, and obviously-bad architectures, leading to recommendations that are hard to implement, with weak business cases.

I’m going to propose a methodology that:

Incorporates external authority metrics into internal PageRank (what I’m calling “local PageRank”), taking pure internal PageRank, the best data-driven approach we’ve seen for evaluating internal links, and avoiding its tendency to focus attention on the wrong areas

Allows us to specify and evaluate multiple different changes in order to compare alternative approaches, figure out the scale of impact of a proposed change, and make better data-aware recommendations

Current information architecture recommendations are generally poor

Over the years, I’ve seen (and, ahem, made) many recommendations for improvements to internal linking structures and information architecture. In my experience, of all the areas we work in, this is an area of consistently weak recommendations.

I have often seen:

Vague recommendations – (“improve your information architecture by linking more to your product pages”) that don’t specify changes carefully enough to be actionable

No assessment of alternatives or trade-offs – does anything get worse if we make this change? Which page types might lose? How have we compared approach A and approach B?

Lack of a model – very limited assessment of the business value of making proposed changes – if everything goes to plan, what kind of improvement might we see? How do we compare the costs of what we are proposing to the anticipated benefits?

This is compounded in the case of internal linking changes because they are often tricky to specify (and to make at scale), hard to roll back, and very difficult to test (by now you know about our penchant for testing SEO changes – but internal architecture changes are among the trickiest to test because the anticipated uplift comes on pages that are not necessarily those being changed).

In my presentation at SearchLove London this year, I described different courses of action for factors in different areas of this grid:

It’s tough to make recommendations about internal links because while we have a fair amount of data about how links generally affect rankings, we have less information specifically focusing on internal links, and so while we have a high degree of control over them (in theory it’s completely within our control whether page A on our site links to page B) we need better analysis:

The current state of the art is powerful for diagnosis

If you want to get quickly up to speed on the latest thinking in this area, I’d strongly recommend reading these three articles and following their authors:

Calculate internal PageRank by Paul Shapiro

Using PageRank for internal link optimisation by Jan-Willem Bobbink

Easy visualizations of PageRank and page groups by Patrick Stox

A load of smart people have done a ton of thinking on the subject and there are a few key areas where the state of the art is powerful:

There is no doubt that the kind of visualisations generated by techniques like those in the articles above are good for communicating problems you have found, and for convincing stakeholders of the need for action. Many people are highly visual thinkers, and it’s very often easier to explain a complex problem with a diagram. I personally find static visualisations difficult to analyse, however, and for discovering and diagnosing issues, you need data outputs and / or interactive visualisations:

But the state of the art has gaps:

The most obvious limitation is one that Paul calls out in his own article on calculating internal PageRank when he says:

“we see that our top page is our contact page. That doesn’t look right!”

This is a symptom of a wider problem which is that any algorithm looking at authority flow within the site that fails to take into account authority flow into the site from external links will be prone to getting misleading results. Less-relevant pages seem erroneously powerful, and poorly-integrated pages that have tons of external links seem unimportant in the pure internal PR calculation.

In addition, I hinted at this above, but I find visualisations very tricky – on large sites, they get too complex too quickly and have an element of the Rorschach to them:

My general attitude is to agree with O’Reilly that “Everything looks like a graph but almost nothing should ever be drawn as one”:

All of the best visualisations I’ve seen are nonetheless full link-graph visualisations – you will very often see crawl-depth charts which are in my opinion even harder to read and obscure even more information than regular link graphs. It’s not only the sampling but the inherent bias of only showing links in the order discovered from a single starting page – typically the homepage – which is useful only if that’s the only page on your site with any external links. This Sitebulb article talks about some of the challenges of drawing good crawl maps:

But by far the biggest gap I see is the almost total lack of any way of comparing current link structures to proposed ones, or for comparing multiple proposed solutions to see a) if they fix the problem, and b) which is better. The common focus on visualisations doesn’t scale well to comparisons – both because it’s hard to make a visualisation of a proposed change and because even if you can, the graphs will just look totally different because the layout is really sensitive to even fairly small tweaks in the underlying structure.

Our intuition is really bad when it comes to iterative algorithms

All of this wouldn’t be so much of a problem if our intuition was good. If we could just hold the key assumptions in our heads and make sensible recommendations from our many years of experience evaluating different sites.

Unfortunately, the same complexity that made PageRank such a breakthrough for Google in the early days makes for spectacularly hard problems for humans to evaluate. Even more unfortunately, not only are we clearly bad at calculating these things exactly, we’re surprisingly bad even at figuring them out directionally. [Long-time readers will no doubt see many parallels to the work I’ve done evaluating how bad (spoiler: really bad) SEOs are at understanding ranking factors generally].

I think that most people in the SEO field have a high-level understanding of at least the random surfer model of PR (and its extensions like reasonable surfer). Unfortunately, most of us are less good at having a mental model for the underlying eigenvector / eigenvalue problem and the infinite iteration / convergence of surfer models is troublesome to our intuition, to say the least.

I explored this intuition problem recently with a really simplified example and an unscientific poll:

The results were unsurprising – over 1 in 5 people got even a simple question wrong (the right answer is that a lot of the benefit of the link to the new page flows on to other pages in the site and it retains significantly less than an Nth of the PR of the homepage):

I followed this up with a trickier example and got a complete lack of consensus:

The right answer is that it loses (a lot) less than the PR of the new page except in some weird edge cases (I think only if the site has a very strange external link profile) where it can gain a tiny bit of PR. There is essentially zero chance that it doesn’t change, and no way for it to lose the entire PR of the new page.

Most of the wrong answers here are based on non-iterative understanding of the algorithm. It’s really hard to wrap your head around it all intuitively (I built a simulation to check my own answers – using the approach below).

All of this means that, since we don’t truly understand what’s going on, we are likely making very bad recommendations and certainly backing them up and arguing our case badly.

Doing better part 1: local PageRank solves the problems of internal PR

In order to be able to compare different proposed approaches, we need a way of re-running a data-driven calculation for different link graphs. Internal PageRank is one such re-runnable algorithm, but it suffers from the issues I highlighted above from having no concept of which pages it’s especially important to integrate well into the architecture because they have loads of external links, and it can mistakenly categorise pages as much stronger than they should be simply because they have links from many weak pages on your site.

In theory, you get a clearer picture of the performance of every page on your site – taking into account both external and internal links – by looking at internet-wide PageRank-style metrics. Unfortunately, we don’t have access to anything Google-scale here and the established link data providers have only sparse data for most websites – with data about only a fraction of all pages.

Even if they had dense data for all pages on your site, it wouldn’t solve the re-runnability problem – we wouldn’t be able to see how the metrics changed with proposed internal architecture changes.

What I’ve called “local” PageRank is an approach designed to attack this problem. It runs an internal PR calculation with what’s called a personalization vector designed to capture external authority weighting. This is not the same as re-running the whole PR calculation on a subgraph – that’s an extremely difficult problem that Google spent considerable resources to solve in their caffeine update. Instead, it’s an approximation, but it’s one that solves the major issues we had with pure internal PR of unimportant pages showing up among the most powerful pages on the site.

Here’s how to calculate it:

The next stage requires data from an external provider – I used raw mozRank – you can choose whichever provider you prefer, but make sure you are working with a raw metric rather than a logarithmically-scaled one, and make sure you are using a PageRank-like metric rather than a raw link count or ML-based metric like Moz’s page authority:

You need to normalise the external authority metric – as it will be calibrated on the entire internet while we need it to be a probability vector over our crawl – in other words to sum to 1 across our site:

We then use the NetworkX PageRank library to calculate our local PageRank – here’s some outline code:
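
A hedged sketch of what that outline might look like, assuming a crawl exported as an edge-list CSV and an external authority export (the file and column names here are illustrative assumptions, not the exact code from the original post):

import csv
import networkx as nx

# Build the directed link graph from a crawl export
# ("Source" and "Destination" columns are assumed; adjust to your crawler's output).
site = nx.DiGraph()
with open("crawl_edges.csv", newline="") as f:
    for edge in csv.DictReader(f):
        site.add_edge(edge["Source"], edge["Destination"])

# Raw external authority per URL (e.g. raw mozRank) from your link data provider;
# pages missing from the provider's dataset simply default to 0.
raw_authority = {}
with open("external_authority.csv", newline="") as f:
    for row in csv.DictReader(f):
        raw_authority[row["URL"]] = float(row["Authority"])

# Normalise to a probability vector over the crawl, i.e. summing to 1 across our pages.
external = {url: raw_authority.get(url, 0.0) for url in site.nodes()}
total = sum(external.values()) or 1.0
personalization = {url: value / total for url, value in external.items()}

# Local PageRank: random jumps return to pages in proportion to their external authority.
# alpha is lowered from the usual 0.85 so that the jump probability roughly matches
# the share of visits arriving via external links.
local_pr = nx.pagerank(site, alpha=0.5, personalization=personalization)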

What’s happening here is that by setting the personalization parameter to be the normalised vector of external authorities, we are saying that every time the random surfer “jumps”, instead of returning to a page on our site with uniform random chance, they return with probabilities proportional to the external authorities of those pages. This is roughly like saying that any time someone leaves your site in the random surfer model, they return via the weighted PageRank of the external links to your site’s pages. It’s fine that your external authority data might be sparse – you can just set values to zero for any pages without external authority data – one feature of this algorithm is that it’ll “fill in” appropriate values for those pages that are missing from the big data providers’ datasets.

In order to make this work, we also need to set the alpha parameter lower than we normally would (this is the damping parameter – normally set to 0.85 in regular PageRank – one minus alpha is the jump probability at each iteration). For much of my analysis, I set it to 0.5 – roughly representing the % of site traffic from external links – approximating the idea of a reasonable surfer.

There are a few things that I need to incorporate into this model to make it more useful – if you end up building any of this before I do, please do let me know:

Handle nofollow correctly (see Matt Cutts’ old PageRank sculpting post)

Handle redirects and rel canonical sensibly

Include top mR pages (or even all pages with mR) – even if they’re not in the crawl that starts at the homepage

You could even use each of these as a seed and crawl from these pages

Use the weight parameter in NetworkX to weight links by type to get closer to reasonable surfer model

The extreme version of this would be to use actual click-data for your own site to calibrate the behaviour to approximate an actual surfer!

Doing better part 2: describing and evaluating proposed changes to internal linking

After my frustration at trying to find a way of accurately evaluating internal link structures, my other major concern has been the challenges of comparing a proposed change to the status quo, or of evaluating multiple different proposed changes. As I said above, I don’t believe that this is easy to do visually as most of the layout algorithms used in the visualisations are very sensitive to the graph structure and just look totally different under even fairly minor changes. You can obviously drill into an interactive visualisation of the proposed change to look for issues, but that’s also fraught with challenges.

So my second proposed change to the methodology is to find ways to compare the local PR distribution we’ve calculated above between different internal linking structures. There are two major components to being able to do this:

Efficiently describing or specifying the proposed change or new link structure; and

Effectively comparing the distributions of local PR – across what is likely tens or hundreds of thousands of pages

How to specify a change to internal linking

I have three proposed ways of specifying changes:

1. Manually adding or removing small numbers of links

Although it doesn’t scale well, if you are just looking at changes to a limited number of pages, one option is simply to manipulate the spreadsheet of crawl data before loading it into your script:

2. Programmatically adding or removing edges as you load the crawl data

Your script will have a function that loads the data from the crawl file and builds the graph structure (a DiGraph in NetworkX terms, which stands for Directed Graph). At this point you can simulate changes, such as adding a sitewide link to a particular page. For example, if this line sat inside the loop loading edges, it would add a link from every page to our London SearchLove page:

site.add_edges_from([(edge['Source'],
    'https://www.distilled.net/events/searchlove-london/')])

You don’t need to worry about adding duplicates (i.e. checking whether a page already links to the target) because a DiGraph has no concept of multiple edges in the same direction between the same nodes, so if it’s already there, adding it will do no harm.

Removing edges programmatically is a little trickier – because if you want to remove a link from global navigation, for example, you need logic that knows which pages have non-navigation links to the target, as you don’t want to remove those as well (you generally don’t want to remove all links to the target page). But in principle, you can make arbitrary changes to the link graph in this way.
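
For illustration, here is a minimal, hypothetical sketch of dropping a navigation link while loading the crawl; the "Link Position" column is an assumption about your crawler's export, not something every tool provides:

import csv
import networkx as nx

# Rebuild the graph while skipping the sitewide navigation link to one target URL,
# keeping any in-content links pointing at it. The URL and column names are illustrative.
TARGET = "https://www.example.com/category/widgets/"

site = nx.DiGraph()
with open("crawl_edges.csv", newline="") as f:
    for edge in csv.DictReader(f):
        if edge["Destination"] == TARGET and edge.get("Link Position") == "Navigation":
            continue  # simulate removing only the navigation link to the target
        site.add_edge(edge["Source"], edge["Destination"])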

3. Crawl a staging site to capture more complex changes

As the changes get more complex, it can be tough to describe them in sufficient detail. For certain kinds of changes, it feels to me as though the best way to load the changed structure is to crawl a staging site with the new architecture. Of course, in general, this means having the whole thing implemented and ready to go, the effort of doing which negates a large part of the benefit of evaluating the change in advance. We have a secret weapon here which is that the “meta-CMS” nature of our ODN platform allows us to make certain changes incredibly quickly across site sections and create preview environments where we can see changes even for companies that aren’t customers of the platform yet.

For example, it looks like this to add a breadcrumb across a site section on one of our customers’ sites:

There are a few extra tweaks to the process if you’re going to crawl a staging or preview environment to capture internal link changes, because we need to make sure that the set of pages is identical in both crawls. We can’t just start at each homepage and crawl X levels deep: by definition we have changed the linking structure and will therefore discover a different set of pages. Instead, we need to take the steps below (a short sketch of the set logic follows the list):

Crawl both live and preview to X levels deep

Combine into a superset of all pages discovered on either crawl (noting that these pages exist on both sites – we haven’t created any new pages in preview)

Make lists of pages missing in each crawl and crawl those from lists
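
Under the assumption that both crawls have already been loaded into NetworkX graphs (the variable names are illustrative), the superset logic is just a couple of set operations:

import networkx as nx

# live_graph and preview_graph would be the DiGraphs built from the two crawls
# (shown here as empty placeholders so the snippet runs standalone).
live_graph, preview_graph = nx.DiGraph(), nx.DiGraph()

live_pages = set(live_graph.nodes())
preview_pages = set(preview_graph.nodes())

all_pages = live_pages | preview_pages             # superset of pages seen in either crawl
missing_from_live = all_pages - live_pages         # crawl these from a list on the live site
missing_from_preview = all_pages - preview_pages   # crawl these from a list on preview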

Once you have both crawls, and both include the same set of pages, you can re-run the algorithm described above to get the local PageRanks under each scenario and begin comparing them.

How to compare different internal link graphs

Sometimes you will have a specific problem you are looking to address (e.g. only y% of our product pages are indexed) – in which case you will likely want to check whether your change has improved the flow of authority to those target pages, compare their performance under proposed change A and proposed change B etc. Note that it is hard to evaluate losers with this approach – because the normalisation means that the local PR will always sum to 1 across your whole site so there always are losers if there are winners – in contrast to the real world where it is theoretically possible to have a structure that strictly dominates another.

In general, if you are simply evaluating how to make the internal link architecture “better”, you are less likely to jump straight to evaluating specific pages. In this case, you probably want to do some evaluation of different kinds of page on your site (a sketch of this labelling approach follows the list), identified either by:

Labelling them by URL – e.g. everything in /blog or with ?productId in the URL

Labelling them as you crawl

Either from crawl structure (e.g. all pages 3 levels deep from the homepage, all pages linked from the blog, etc.)

Or based on the crawled HTML (all pages with more than x links on them, with a particular breadcrumb or piece of meta information labelling them)

Using modularity to label them automatically by algorithmically grouping pages in similar “places” in the link structure
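
As a hedged illustration of the labelling idea above, here is one way to group local PageRank by a simple URL-based label and compare two scenarios (the label rules and variable names are assumptions for the example, not part of the original methodology):

from collections import defaultdict

def label(url):
    # Illustrative URL-based labelling; adapt the patterns to your own site sections.
    if "/blog/" in url:
        return "blog"
    if "productId" in url:
        return "product"
    return "other"

def local_pr_by_label(local_pr):
    # Sum local PageRank per page type so scenarios can be compared at the group level.
    totals = defaultdict(float)
    for url, score in local_pr.items():
        totals[label(url)] += score
    return totals

# live_pr and preview_pr would be the local PageRank outputs for the current and
# proposed structures respectively:
# for page_type, score in local_pr_by_label(preview_pr).items():
#     print(page_type, score - local_pr_by_label(live_pr)[page_type])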

I’d also like to be able to come up with some overall “health” score for an internal linking structure, and have been playing around with scoring it based on some kind of equality metric, under the thesis that if you’ve chosen your indexable page set well, you want to distribute external authority as evenly throughout that set as possible. This thesis seems most likely to hold true for large long-tail-oriented sites that get links to pages which aren’t generally the ones looking to rank (e.g. e-commerce sites). It also builds on some of Tom Capper’s thinking (video, slides, blog post) about links being increasingly important for getting into Google’s consideration set for high-volume keywords, which is then reordered by usage metrics and ML proxies for quality.

I have more work to do here, but I hope to develop an effective metric – it’d be great if it could build on established equality metrics like the Gini Coefficient. If you’ve done any thinking about this, or have any bright ideas, I’d love to hear your thoughts in the comments, or on Twitter.
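
For what it’s worth, here is a minimal sketch of the kind of equality metric I have in mind: a discrete Gini coefficient over the local PageRank of your indexable pages (one possible scoring approach under the thesis above, not an established part of the methodology):

def gini(values):
    # Gini coefficient of non-negative values: 0 = perfectly even distribution, 1 = maximally unequal.
    vals = sorted(values)
    n, total = len(vals), sum(vals)
    if n == 0 or total == 0:
        return 0.0
    cumulative = sum((i + 1) * v for i, v in enumerate(vals))
    return (2 * cumulative) / (n * total) - (n + 1) / n

# e.g. score only the indexable pages you actually want to receive authority:
# inequality = gini([local_pr[url] for url in indexable_pages])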

Slides: SearchLove London 2017 | Will Critchlow | Seeing the Future: How to Tell the Impact of a Change Before You Make It (from Distilled)

Meet the New Boss, Same as the Old Boss

Posted by on Jul 10, 2018 in SEO Articles | Comments Off on Meet the New Boss, Same as the Old Boss

Meet the New Boss, Same as the Old Boss

Seriously Google Search Console, why do you do this to me?

I want to like (trust) you, despite the historically numerous discrepancies in your data.

I’m delighted you are addressing the feature lag that SEOs love to complain about.

But seriously, can we get internally consistent URL level data? Please?

 

The post Meet the New Boss, Same as the Old Boss appeared first on Local SEO Guide.

Top 7 Methods to SEO Optimize WordPress in 2018

Posted by on Jul 10, 2018 in Greg's SEO Articles | Comments Off on Top 7 Methods to SEO Optimize WordPress in 2018

If you are a WordPress site owner, you probably want to know how to optimize your WordPress sites to get the best results. WordPress is a unique platform, and you need to go about optimizing it for rankings in a different way than you would with regular sites.

However, the critical components of SEO are still the same as they would be on any other site. Much of the success you’ll experience for your WordPress sites is determined by the type of hosting plan and platform you have. Remember that WP is not a standalone platform. You can integrate your WP site within another web server. In fact, this is strongly encouraged.

We’ve compiled a list of 7 methods to help you optimize your WordPress site in 2018 and beyond. We have tried to include elements that may have changed in the past year or so, as well including how to get faster load times and other aspects.

1. Choose the Right Hosting Plan

As mentioned before, one of the most important things you can do to help boost your WordPress site is to find the right hosting plan. WordPress, though it can be a unique site by itself, often does better in the search rankings when you put it on a server that has its internal algorithms to help promote your search results.

You should do some shopping around when looking for the best server and check prices. But, most of all look into web servers that allow you to use internal optimization tools to increase your results in the rankings. You also need a server that has good bandwidth and the space to include video content, audio podcasts, and images within your content.

2. Have the Right Plugins

One of the main elements on the WordPress platform is their many plugins. These plugins serve to allow you to expand on the ability of your WordPress pages and to do more with your site. Some examples of great plugins offered by WordPress are listed below.

JetPack: allows you to keep up with your web stats, related posts, social sharing, and many other elements

Yoast SEO: This plugin lets you improve your content and write better copy, using third-party algorithms that check for readability scores

WooCommerce: This plugin allows you to connect to the eCommerce world by setting up a shopping cart and online payment platform that lets you do business with your WordPress site.

These are just a few examples of how WordPress plugins can help you increase the quality of your site and get more views. If you are after page 1 ranking in Google, this is an excellent way to improve your chances of doing this. That is because the more visits you get and the more popular your site becomes, the higher your ranking goes.

One other issue you need to keep in mind is how your plugins are performing on your site. If there is a glitch in one of your plugins or you have downloaded some third-party apps, you may want to delete the ones that are causing problems. They can slow down your site if they are not performing well and can decrease your site performance and traffic.

Remember no plugin is worth losing traffic, so make sure all of your plugins are performing well and increasing your site traffic. If they are not, cut them out.

3. Create attractive blog posts with media.

People are visual. If you can create more attractive blog posts complete with media such as video content, pictures, and podcasts, you’ll get more views, increase your engagement, and you may be on your way to page 1 ranking. You cannot get there with text alone, and your blog should be visually appealing to pass this test.

4. Be a spam detective.

Remember that throughout the past few algorithm updates, Google has reminded us constantly that they do not like spam! Black Hat techniques are out. You need to focus only on the most ethical means to get your traffic, even if it takes longer to build up your audiences. You may win a customer for one day, but that customer won’t be back if they figure out they’ve been tricked. So include only relevant content on your website, and keep any backlinks and trackbacks free of extraneous material, irrelevant links and spam.

5. Use a Fast WordPress Theme

Did you know that even your WordPress theme could be slowing you down? That’s right. WordPress themes are great, but some take much longer to load than others. Part of this is due to the number of images or plugins. Other issues may have to do with the format of the theme itself and how it is arranged on the page. Try out different themes and make sure your site and theme are not slowing down your page load time. This affects your page ranking too, because Google’s bots look for content that loads quickly when indexing pages.

6. Optimize your images for WordPress and Google

Whenever you put images on your site, you are adding another trackback feature to your page. That’s good for rankings, but only if your images are correctly optimized. When you have a WordPress site, you need to optimize your images for both WordPress and Google. Start by making sure your images are loading correctly on WordPress. If they take too long to load, you may need to reduce the size for your platform so that the load time will be diminished.

WordPress automatically gives you a variety of sizes to choose from when you upload an image to their platform. However, this is only the display size. To change the actual file size requires some sort of offline or online editor.

Then you need to optimize images for Google. You can read about Google’s recommended image sizes here. Any image will load but obviously, the file size will determine how fast it loads. You may need to experiment with various image sizes to see which size is optimal for your server and your site.

7. Choose advertisements wisely

Another important consideration you should think about when developing your WordPress sites is to carefully consider your advertising options. Larger ads and some customized ads or even Google Ads can interfere with the load times and efficiency and speed of your site. Why is this important? Because Google ranks sites according to many factors, one of which is the speed of loading.

If it takes your site too long to load when people click on your links, you will not only upset Google. You’ll upset your potential customers and clients. People don’t like to wait to see the content and many are coming in from mobile devices. They are in a big hurry. Let’s face it: we all are! So focus on creating content that is interesting and valuable while not overloading it with ads or other content that slows down your site load time. Consider embedded links to ads rather than large banner ads, for example. This one thing can make a big difference in the customer’s experience.

Search engine optimization is always a subtle balance between using keywords within your site, as well as in your blog, YouTube videos, and any other assets you have out there to drive traffic and to get your message out to the people who are most likely to respond to your content.

High Quality is Job One

Writing high-quality, valuable content is the main way to create a great WordPress site. But you should keep SEO in the back of your mind as you build your content and always remember how this works to maximize the impact your brand will have.

In short, you need to appeal to both search engines and people.

How do you do this? You appeal to people by writing high-quality content that will engage them once they are on your site. And you utilize high-quality and long-tail keywords to get them there. Think like the searcher and consider what words you would type if you were looking for the types of products and services that you offer.

When you can successfully match up search terms with the results on your page, you’re on your way to success. Maybe you’ll even see your site with a page 1 ranking.

If you have a site on WordPress, congratulations! WP is the most used platform for blogs, websites, and online publications today. It is predicted that around 33% or more of all sites are now on this platform. You just have to think about things a bit differently when dealing with the components of your WordPress site and adjust your content marketing strategy accordingly.

Some Final Thoughts

All of these tips are great if you are trying to figure out how to optimize your site through search engine optimization. Once you have your WordPress site on a web server, you will go through the process of optimizing, advertising and promoting your site in similar ways that other site owners do. However, it is important to remember that WordPress is a unique platform with its own internal algorithms and perks, so you must keep all of these elements in mind when promoting your site.

Follow the tips we’ve offered above and let us know if you are successful with any other tips that you think of as you go! We love to hear about your success.

Can Negative Brand Mentions Hurt Your Rankings? The Answer Might Surprise You!

Posted by on Jul 10, 2018 in SEO Articles | Comments Off on Can Negative Brand Mentions Hurt Your Rankings? The Answer Might Surprise You!

Can Negative Brand Mentions Hurt Your Rankings? The Answer Might Surprise You!

We all keep hearing that any publicity is good publicity. But is it really so? And, more importantly, is it the same for SEO? We’ve all heard about the link building technique using unlinked Brand Mentions. It’s a good method and, as far as we know, it works very well.

 

But what if the unlinked brand mention that you’re so excited about turns out to be a RipoffReport post? What if it’s a blogger complaining about your services, or a news publisher reporting on some incident regarding your brand?

 

 

How Web Brand Mentions Affect SEO Today

Brand Anchor Text Backlinks
Unlinked Brand Mentions or Google Implied Links
Branded Search Queries

How Does Google Treat Negative Brand Mentions

Google’s Search Quality Evaluator Guidelines
How The Google Algorithm Works

Will Negative Brand Mentions Affect Your Google Rankings?

Is All Publicity Good Publicity?
Handling Negative Mentions From a Search Point of View
Handling Negative Mentions From a Backlinks Point of View
The Role of Artificial Intelligence

How to Deal With Negative Brand Mentions

How do you find Your Negative Mentions

 

Should you get a link from these kinds of sources as well? Will it help you or will it harm your website’s rankings? Well, you’re about to find out!

 

In this article we will dissect how Google treats this matter and find out if negative brand mentions impact your rankings and SEO efforts. So, keep reading.

 
1. How Web Brand Mentions Affect SEO Today

 

In general, we all know that good branding and PR will positively affect our business. It makes sense. People love well established brands with a lot of satisfied customers and great deals.

 

For example, most people would still buy an iPhone over a Xiaomi Phone, although some models apparently have similar characteristics and lower prices. People buy from certain brands because of trust and authority. To obtain the audience’s trust, a brand must be well established and have good PR.

 

For your business to flourish online, besides earning the customers’ and audiences’ trust, you also have to win the search engine’s trust.

 

Customer and search engine trust kind of go hand in hand, but, from what we know, SEO works based on some algorithms. There are many important factors which lead to authority, from domain age and security to backlinks, CTR and branded searches. We’ll ignore domain age, as it’s not really something we have control over, but we’ll be focusing on the other ones.

 
1.1 Brand Anchor Text Backlinks

 

As presented in this link building lessons article, websites with a high percentage of branded anchor text backlinks tend to rank higher than the ones with a high percentage of commercial anchor text backlinks.

 

Screenshot from the cognitiveSEO Tool showing anchor text distribution for a top ranking website in its niche

 

The cause for this phenomenon is probably Google’s association of these links with naturalness. When talking about you on the web, people are more inclined to link to your website through a branded or URL anchor text, rather than through a commercial anchor text.

 
1.2 Unlinked Brand Mentions or Google Implied Links

 

A website can also have a high number of unlinked brand mentions across the web. They are often called citations and are very popular within the local SEO field.

 

While these probably don’t provide as much value for SEO as backlinks do, they certainly have an impact. Google actually has a patent on that so we know it’s true. It defines them as implied links:

 

Screenshot from Google’s patent defining implied links

 

The key element here is relevancy. The citations you’re getting to your local business should reflect the location of your business. If you’re in Seattle but get citations from Russian websites that mostly mention Russian businesses using Cyrillic characters, don’t expect your rankings to grow very fast.

 

On the other side, if your business targets a global market, then it makes sense to have all sorts of unlinked mentions across different countries. Relevant content is also important, so don’t spam on every site.

 

Also, keep in mind that mentions that do contain backlinks, especially dofollow ones, have a lot more impact on your search engine rankings than mentions that don’t.

 

How unlinked mentions compare to linked mentions.
That’s chainmail armor. Get it? … ’cause … it has links? (ba dum tsss)

 

So, generally speaking, Google does take a look at your unlinked brand mentions and the content surrounding those mentions, but it’s better if you can get a link back to your website from them. That way you can be sure that you maximize the potential of the mention, as it’s both a mention and a link at the same time.

 

And remember, you want to get mentions from reputable websites that are relevant to your niche and geographic targeting, be they linked or unlinked. Creating them yourself using the same IP might turn out to be a waste of time and could actually get you into some trouble.

 

A great way of finding your unlinked mentions is Brand Mentions, a web mention monitoring tool. It has a lot of advanced features, but this is one of the best, as it enables you to establish a connection with webmasters that mention you and potentially acquire a link.

 

Screenshot from the Brand Mentions tool showcasing the potential to discover unlinked web mentions

 
1.3 Branded Search Queries

 

We can categorize search queries into two types: branded and non-branded. A non-branded search query would be “keyword tool” and a branded search query would be “cognitiveSEO keyword tool”.

 

You want to be ranked on both of them. If you’re not ranking for your own brand name, that is kind of bad. You either chose the wrong brand name, or you’re heavily penalized by search engines.

 

Each of the two has a catch:

 

Non-branded keywords, for example, have a high search volume, which means a lot of potential customers. However, there’s a lot of competition there, so they are harder to rank for and the CTR might be lower.

Branded keywords, on the other hand, probably have a low search volume, but they are very easy to rank for (if you are that brand) and the conversion rates are very high, as the users usually already know what they want.

 

 

Although this is speculative, some experiments and observations (including some of our own) have shown that branded keywords can eventually increase the rankings for non-branded keywords. So, if a lot of people search for “your brand + keyword” and click on your website, you should also see an increase in rankings on the “keyword” itself.

 

Rand Fishkin also agrees with this and explains it in the following Whiteboard Friday video:

 

 

That being said, it’s always a great idea to work on your PR so that people search for your brand on the web and associate products with your brand. Not only will it increase conversion rates, but it can also help with rankings on non-branded keyword versions.

 
2. How Does Google Treat Negative Brand Mentions

 

Ok, so we know that Google likes branded anchor text links and looks at brand mentions across the web to establish your website’s authority.

 

But, in real life, having more mentions isn’t necessarily better. What if you have thousands of negative reviews, mentions related to fraud or other bad stuff? That could affect your business deeply, or even send it into bankruptcy.

 
2.1 Google’s Search Quality Evaluator Guidelines

 

Google’s algorithms can’t yet determine what is good and what is bad. That’s why the company has set up a team of humans (around 10,000 or so) to take care of the job. The initiative doesn’t only target brand mentions. The purpose of these evaluators is to help improve the quality of the search results in general.

 

Unlike the manual spam team, which can apply manual actions on websites, the search quality evaluators can’t directly influence search rankings. Instead, they follow these guidelines to evaluate websites and provide feedback about the search results to Google.

 

From the guidelines, we can observe that Google demands the lowest rating to be given to websites with negative or malicious reputation.

 

One criterion would be harmful pages that contain malware or that are phishing for information (page 37), but we can also see on page 43 that the evaluators should consider low BBB ratings and other sources that expose fraudulent behavior.

 

Screenshot from the Search Quality Guidelines

 

We can also include the issue of fake news here. Fake news websites purposely create these fake stories to be as shocking as possible, not necessarily to influence people, but to gain hype on social media, which brings traffic to their websites and eventually makes them money from ads.

 

 

These fake stories can ultimately even rank high in Google for specific keywords. Fake news has a big impact on political campaigns, as messages spread through fake news often influence the decisions of uninformed and naive voters.

 

You can watch the following video for reference about why fake news is so popular:

 

 

So far, it’s a known fact that social media hype, backlinks and a lot of mentions can get you ranking at the top. Unless the story is critically dangerous, such as one that incites violence, it can rank there for a very long time without any manual action being taken.

 

Google also has an initiative in which they will work closely with journalists to combat fake news. The search quality evaluators also play an important role, as they can provide patterns that fake news websites use, which can help the algorithm raise a red flag much easier.

 

However, the subject is a lot more sensitive. Malaysia, for example, has passed a law against fake news which could be punished with over $100,000 in fines and up to 6 years of jail time. This raises questions about free speech and corruption (in that particular case). Same thing goes with Google, as they would literally have to censor some websites if they consider them to be fake news websites by not giving them a chance to rank for certain keywords.

 
2.2 How The Google Algorithm Works

 

After the evaluators provide their feedback and Google compiles the info, it gets fed to the algorithm. This way, over time, Google starts understanding the patterns of fake news or malicious websites and websites with a very bad reputation.

 

It’s hard to tell exactly what the algorithm will be based on, but I can speculate that it will look for reviews in structured data or similar implementations, articles related to fraud or other criminal activities, and even news sites or videos. It can then follow patterns, affecting an entire server or network of similar sites related in any way to the initially penalized low quality/reputation website.

 

One thing’s for sure: Google’s algorithms will be constantly improving and will eventually be able to understand if a business has a very bad reputation or if it’s involved in any suspicious or criminal activities.

 
3. Will Negative Brand Mentions Affect Your Google Rankings?

 

If we’re only talking about a small part of your customers leaving negative reviews or a couple of bloggers trashing you for not accepting their promotion offer, there’s nothing to worry about. In fact, it might actually benefit you if you can get backlinks.

 

However, if we’re talking about massive PR drops, thousands of complaints, serious fraud and the press going crazy over it, then yes, it probably will. But not in the way you might think it will. To be more exact, it won’t directly impact your rankings, but they will ultimately drop because of other indirect factors.

 
3.1 Is All Publicity Good Publicity?

 

If you know about the White Moose Cafe, then you probably know about all the scandals regarding the vegans and the bloggers. If not, here’s some input:

 

So there are actually two stories. One is about some vegan complaining about the restaurant having no vegan food in it. The vegan community started blasting the Hotel’s (yes it’s actually a Hotel not a Cafe) Facebook page with 1 star reviews. The hotel then started mocking the vegan community.

 

 

This created a lot of hype with more vegans turning red and massively hitting the 1 star review button but, on the other side, there were a lot of people defending the White Moose Cafe.

 

More recently, a similar story took place after a YouTube content creator asked for free accommodation in exchange for exposure on her channel. The Hotel’s owner mocked her and the bloggers community started rolling in with 1 star reviews. The owners now posted about banning all bloggers from their hotel, just as with vegans, above.

 

Again, a lot of people were defending the White Moose Cafe and, after all this hype, the hotel’s Facebook Page still stands at 4.4 out of 5 stars.

 

The White Moose Cafe kept the jokes coming by sending the blogger a bill for publicity services totalling over $6 million. While it might sound exaggerated, it’s actually pretty accurate. Considering the amount of hype the posts got around social media and how many news publications caught up on the story, the price seems fair. The blogger actually gained a fair 10,000-20,000 subscribers and most of her complaint videos were monetized.

 

 

Paul, the owner of the White Moose Cafe, actually has a great post about this in which he elaborates upon the topic and explains how controversy can be a great way of promoting your business, as long as you don’t get trashed completely (what if it was all actually a setup and they actually worked together?). The secret is to also have one side defending your point of view. Another thing you can do is simply ask questions instead of expressing an opinion.

 

While I don’t actually agree with trolling and insulting other people for publicity, I have to admit that I find it kind of funny at the same time, being both a vegan and a blogger myself. Paul has some strong points of view and reading through his blog you can actually see that he’s not a bad person.

 

Now these aren’t really negative brand mentions, as overall, the hotel seems to have gained more than it lost. But there are some that might actually get you into trouble.

 
3.2 Handling Negative Mentions From a Search Point of View

 

When users search the web, they often look for reviews. If it’s a product, they look for other people who have previously bought and used it. If it’s a service, they look upon other client’s experiences.

 

Reviews have a big impact on the purchase decision of a client, as the following survey shows:

 

 

Now some people are aware that not all negative reviews are accurate, but there are some review sites that are very trustworthy or, better said, credible. One of them, for example, is RipoffReport, which lists scams, usually in the digital marketing field. People can comment on them, so usually there’s a community of people complaining instead of just an individual.

 

I had a user ask whether they could make a RipoffReport post mocking one of their clients rank lower, as it was occupying one of the first 5 positions for their main brand search.

 

Screenshot from Intercom showing discussion. I censored the user’s name, photo and website for protection.

 

I told him that the only thing he could do was get the post removed somehow, although I knew he didn’t have much chance. I can’t show the Google results as I’d probably have to black out 99% of the image, but you have my word: that RipoffReport post is ranking high for their brand name and, considering the things said in it, it doesn’t look very good for the company.

 

If you’re in a similar situation or your overall number of reviews is under 3 stars, then you should consider taking serious action on the quality of your products / services. Sometimes it might not be entirely your fault. For example, people might be complaining about slow shipping. But if it is so, then change the shipping company you work with.

 

People really care about reviews, and if they avoid your website because of this, it starts to resemble a bad user experience, which we know Google cares a lot about and which could potentially affect your rankings.

 

If this also results in a lower number of branded searches for your website, then it might also have a negative impact on your rankings.

 
3.3 Handling Negative Mentions From a Backlinks Point of View

 

As long as the link isn’t coming from malicious websites with malware or a very spammy link profile, I can’t really see how they could directly affect your rankings.

 

If someone wants to give you as a negative example, they might or might not link to you. However, if they do link, they will probably do it through nofollow links, which will still benefit your rankings.

 

So far, we don’t have any data on links surrounding negative mentions, but we do know that links from malicious or spammy websites are bad.

 

The only way they could affect you is that users from the article that speaks badly about you might land on your website, resulting in more negative reviews.

 
3.4 The Role of Artificial Intelligence

 

Eventually, we will reach a point where Google will become true AI. At that point, whatever the search quality evaluators have done so far, Google will be able to do itself, probably infinitely better. Fake news will be a thing of the past and the bot might even be able to distinguish between genuinely fake news or hate speech and mere offenses and jokes.

 

Image Source: popsci.com

 

We’re not quite there yet, but it’s definitely something that Google’s trying to achieve. At that point, we’ll probably also reach the singularity and Google will have already found a solution to world hunger and all human problems so we won’t have to worry about SEO anymore.

 
4. How to Deal With Negative Brand Mentions

 

Well, the best way of dealing with them is not having them…
No, seriously, if your company is convicted of some sort of crime, you probably have bigger problems on your hands. That’s not going to be easy to erase and, even if you rebrand, through the power of social media and social justice warriors you can’t stay undercover for long.

 

While you should constantly improve your customer care sector, there are some natural negative reviews and brand mentions that you can’t avoid with 100% accuracy. People are more likely to leave a review after a negative experience than they are after a positive one, so you’re already at a disadvantage.

 

The best way of dealing with negative reviews is to answer them, deal with them, learn from them and improve your services.

 

Sometimes, people are jerks. They rate your product or service 1 star for no real reason. For example, I’ve seen people just saying “It’s a bad product” without specifying why, and others leaving a 1-star review because the product didn’t match their expectations, comparing a low-end product with a high-end one even though it was good value for the price.

 

However, by explaining your point of view in the comments you have the opportunity to get your side of the story listened to. For example, I’ve recently read a comment in which one client was complaining about the CPU not matching his socket and that he sent it back but the store wouldn’t accept the return. The store replied and said that the CPU’s pins were damaged and that they even tried to repair it themselves but didn’t succeed.

 

By addressing the issues in the negative reviews, not only can you turn a customer’s frown into a smile but you can also show the other clients that you care, that you are sorry for what happened and that you are willing to learn and improve.

 
4.1 How do you find Your Negative Mentions

 

It’s hard dealing with negative brand mentions if you can’t track them, don’t you think? Well, Brand Mentions actually has a great feature that can sort out the good from the bad and make your life a lot easier.

 

Reputation management from Brand Mentions

 

So, if you want to manage your reputation you can always use the sentiment analysis feature, segment your mentions and deal with them a lot better.

 

Conclusion

 

So, in the end, it seems like negative brand mentions don’t directly impact your search engine rankings. However, if these mentions result in fewer branded searches, or if other news sites rank above you on branded queries, then you might eventually suffer some SEO consequences as well.

 

Have you ever experienced bad PR or negative brand mentions? If yes, did it affect your rankings? Did you manage to fix the issue? If yes, how? If not, maybe we can discuss it and find a solution. Feel free to start a discussion in the comments section below.

 

The post Can Negative Brand Mentions Hurt Your Rankings? The Answer Might Surprise You! appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

Do Your Google Keyword Rankings Suck? Do This.

Posted by on Jul 10, 2018 in SEO Articles | Comments Off on Do Your Google Keyword Rankings Suck? Do This.

Do Your Google Keyword Rankings Suck? Do This.

Every business wants to increase its Google keyword rankings, but most struggle to do so.

Why is that?

Because most people trying to do SEO throw sh*t at the wall and hope for the best.

Those days are over.

Today, I’m going to show you how to increase your keyword rankings in Google step-by-step. Just follow my lead and you’ll start seeing improvements (as long as you put in the work).

Let’s start.

6 Ways to Increase Your Google Keyword Rankings
1. Qualify Your Target Keyword(s)

One of the biggest issues I find is that people try to rank for keywords that are way too competitive (use this keyword research guide if you don’t have any keywords yet). For example, new websites shouldn’t be targeting competitive keywords.

That’s why you must first qualify your keyword based on all the available data you have. Then, you need to analyze the competition for your target keyword to make sure it’s worth the time, resources, and effort.

This brief analysis will save you a lot of headaches down the road. Let me start with the first part of the process: qualification.

Keyword qualification simply means confirming that your target keyword actually has real interest behind it. The most obvious qualification metric is monthly search volume, which you can find in Google Keyword Planner.

Other tools like Keywords Everywhere, SEMRush, and Ahrefs will also give you rough search volume estimates.

The reason you don’t want to rely solely on search volume estimates is that:

They aren’t 100% accurate
Most people base their entire keyword research strategy on them, and…
There are other ways to qualify ideas outside of search volume, including user signals, social engagement, trends (from Google Trends), and even backlinks.

These are “signals” that your topic is qualified.
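
For the trends signal in particular, you can sanity-check a keyword programmatically. Below is a minimal sketch using pytrends, an unofficial Python wrapper around Google Trends, so treat the numbers as relative interest rather than absolute search volume; the keyword is just a placeholder.

```python
# A minimal sketch: is interest in this keyword growing or dying?
# pip install pytrends (unofficial Google Trends wrapper)
from pytrends.request import TrendReq

keyword = "standing desk"  # placeholder keyword

pytrends = TrendReq(hl="en-US")
pytrends.build_payload([keyword], timeframe="today 5-y")
interest = pytrends.interest_over_time()  # weekly relative interest, 0-100

if not interest.empty:
    earlier = interest[keyword].head(26).mean()  # first ~6 months of the window
    recent = interest[keyword].tail(26).mean()   # last ~6 months of the window
    verdict = "growing" if recent > earlier else "flat or declining"
    print(f"'{keyword}' looks {verdict} "
          f"(recent avg {recent:.0f} vs earlier avg {earlier:.0f})")
```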

Watch this video to see exactly how to qualify your keywords:


After you’ve established that your keyword is worth your effort, you have to analyze your competition. There are two types of competitor analysis. The first is what I like to call the 10-second competitor analysis.

There are two ways to perform this short analysis. You can simply enter your target keyword in the Ahrefs Keyword Explorer tool.

Ahrefs will then give you an estimate of the “Keyword Difficulty”.

I’ve actually found this to be pretty accurate.

The other method is to scan the first page of Google results for your target keyword. You’ll need to install the Moz or Ahrefs toolbar to do this.
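
If you want to put rough numbers on that 10-second scan, you can summarize the first page from a spreadsheet export. The sketch below assumes you have saved the top 10 results with their authority metrics to a CSV; the file name and column names (domain_rating, referring_domains) are hypothetical, so rename them to match whatever your toolbar or tool actually exports.

```python
# A minimal sketch for summarizing first-page competition from an
# exported CSV. Hypothetical columns: url, domain_rating, referring_domains.
import csv
import statistics

with open("serp_top10.csv", newline="") as f:
    rows = list(csv.DictReader(f))

ratings = [float(row["domain_rating"]) for row in rows]
ref_domains = [int(row["referring_domains"]) for row in rows]

print(f"Results analyzed: {len(rows)}")
print(f"Median domain rating: {statistics.median(ratings):.0f}")
print(f"Median referring domains: {statistics.median(ref_domains):.0f}")

weakest_rating, weakest_url = min(zip(ratings, (row["url"] for row in rows)))
print(f"Weakest result: {weakest_url} (rating {weakest_rating:.0f})")
```

If the median authority of the first page is far above your own site’s, that’s a sign the keyword probably isn’t worth chasing yet.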

I recommend reading my SEO content guide because I explain exactly what you need to look for.

If your keyword passes the qualification and competitor tests, then you’re ready to move on to the next step.

2. Audit Your Site

Poor technical and UX performance can drag down your entire site. That’s why it’s fundamental that you audit your site to make sure it’s built on a strong foundation.

I recommend reading my SEO audit guide because it will show you every factor you need to look for.

The truth is most people don’t take technical SEO and User Experience (UX) seriously. This is a big mistake.

You have to remember that even if you do successfully improve your keyword rankings in Google, it’s not a guarantee that they will stick. The “stickiness” of your rankings will be determined by how users are interacting with your page and website as a whole.

Most tactics you’ll learn in this guide will increase your organic search traffic, but solid technical SEO and UX is what will keep users happy and keep them on your website.
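
A full audit needs a proper crawler, but even a quick scripted pass will catch the most obvious problems. Here is a minimal sketch that checks a few placeholder URLs for status codes, missing title tags, and missing meta descriptions using requests and BeautifulSoup; it’s a starting point, not a replacement for a real audit.

```python
# A minimal sketch for spot-checking basic technical/on-page issues.
# pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

urls = [  # placeholder URLs -- swap in your own pages
    "https://example.com/",
    "https://example.com/services/",
]

for url in urls:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = desc_tag.get("content", "").strip() if desc_tag else ""

    issues = []
    if resp.status_code != 200:
        issues.append(f"status {resp.status_code}")
    if not title:
        issues.append("missing <title>")
    if not description:
        issues.append("missing meta description")

    print(f"{url}: {', '.join(issues) or 'OK'}")
```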

After you’ve performed a site audit and cleaned up all technical and UX issues, it’s time to dissect the page you’re trying to rank.

3. Analyze Your Target Page

You can get keyword research and technical SEO right, but if your target page sucks, you won’t rank well.

Nothing is more important than the actual page you’re trying to rank in Google.

If your target page doesn’t at least match the quality of what’s already ranking, then you have some work to do.

My philosophy for creating keyword-targeted pages is that the page must be substantially DIFFERENT and 10x more valuable than what currently exists.

Of course, users are who will decide what’s valuable, but you need to at least operate with these standards.

That means you need to be honest with yourself:

Did you put in as much effort as your competitors?
Does your page bring anything new to the table or is it just rehashed information that already exists?
Is your page 10x more valuable than theirs?

These questions are easy to answer when you’re analyzing blog-driven search results.

But what about when you’re a local business?

Let’s say you’re a plumber and you want to rank for “best St. Louis plumber.”

The first thing you would want to do is look at the ranking results and determine what the search intent is. From a customer cycle perspective, it’s obvious that the search query “best St. Louis plumber” is in the investigative or possibly transactional phase. This means the searcher is likely looking to hire a plumber in the near future.

But one other element you need to analyze is HOW your competitors are ranking.

In the case of “best St. Louis plumber”, you’ll notice that the majority of the local pack results are using homepages.

This is an indication that you should do the same.

But then the question is:

How do you create a homepage that is different and more valuable than what’s ranking?

It all starts with your service.

What makes your plumbing service different and better than your competitors?

Then after that, you need to think how you can be more helpful than your competitors.

What information can you give your customers to help them make a better decision?
What expertise can you give them in advance to prove that you’re the authority and plumber they should pick?

Lastly, focus on the design and UX for your homepage. The better the experience, the more likely a prospect will favor your company.

My favorite way to spice up a homepage is to add video content. Video is a powerful sales tool, but it can also help your SEO performance because it increases user dwell time on your page.

I recommend reading and taking action on my SEO content guide to understand the concepts I’ve mentioned on a deeper level.

Now I want to show you one of the quickest ways to improve your rankings (after you’ve taken the previous steps).

4. Leverage Existing Authority

As soon as I publish a new keyword-targeted page, I immediately look for opportunities to inject internal links to it. This is the fastest way to send authority to a page, which will often give it an immediate boost in rankings (once Google recognizes the changes). My order of operations is usually to add internal links on my most authoritative pages. Here’s how you can do it:

Go to Ahrefs, click on Site Explorer, enter your URL and start the analysis.

Then click on “Best by Links” under the pages section.

Ahrefs will then show you what pages on your website have the most backlinks.

These are the pages you want to add internal links to, if it makes sense. This strategy will give you a nice boost in most scenarios, but it usually won’t be enough to produce a dramatic jump in rankings.
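
To make that hunt a little less manual, you can script it. The sketch below takes a few of your most-linked pages (placeholders standing in for whatever the “Best by Links” report shows you), fetches each one, and flags pages that mention your target phrase but don’t yet link to the new page; the URLs and keyword are assumptions for illustration.

```python
# A minimal sketch for finding internal link opportunities on your
# strongest pages. pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

top_pages = [  # placeholders: your most-linked pages from "Best by Links"
    "https://example.com/popular-guide/",
    "https://example.com/big-resource/",
]
new_page = "https://example.com/new-keyword-page/"  # page you want to boost
keyword = "st. louis plumber"  # placeholder target phrase

for url in top_pages:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    mentions_keyword = keyword in soup.get_text().lower()
    already_links = any(a["href"] == new_page for a in soup.find_all("a", href=True))

    if mentions_keyword and not already_links:
        print(f"Opportunity: {url} mentions '{keyword}' but doesn't link to {new_page}")
```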

That’s why you need to then go through the process of acquiring more backlinks.

5. Acquire More Backlinks

Backlinks are one of the most important factors for increasing your Google keyword rankings. I recommend reading my guide on backlinks because it will give you the foundation you need for understanding how to do effective white hat link building.

I’m someone who prefers content-centric link building strategies.

My strategy is simple:

Create incredibly valuable content assets that people will WANT to link to.

The SEO content guide I mentioned above is a good place to start.

There are some link building tactics that I leverage that do not have content dependencies.

Some of those include:

Guest posting
Link lists/resource pages
Expert roundups
HARO submissions
Blogroll placements

It’s important to note that each of these link types requires different anchor text distributions to avoid getting penalized. For example, you should only use branded anchor text for blogroll link placements. That’s because most blogrolls are site-wide and you don’t want a keyword-rich anchor being distributed across thousands of pages.
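
You can keep an eye on this by computing your anchor text distribution from a backlink export. The sketch below assumes a CSV with an “anchor” column (a hypothetical name, so adapt it to your tool’s export) and buckets anchors into branded, keyword-rich, and everything else; the brand and keyword lists are placeholders.

```python
# A minimal sketch for checking anchor text distribution from a
# backlink export CSV. Hypothetical column name: "anchor".
import csv
from collections import Counter

branded_terms = ["acme plumbing", "acmeplumbing.com"]    # placeholder brand terms
target_keywords = ["st. louis plumber", "best plumber"]  # placeholder money keywords

def classify(anchor):
    text = anchor.lower()
    if any(term in text for term in branded_terms):
        return "branded"
    if any(kw in text for kw in target_keywords):
        return "keyword-rich"
    return "other (naked URLs, generic anchors, etc.)"

with open("backlinks_export.csv", newline="") as f:
    counts = Counter(classify(row["anchor"]) for row in csv.DictReader(f))

total = sum(counts.values())
for bucket, count in counts.most_common():
    print(f"{bucket}: {count} ({count / total:.0%})")
```

If keyword-rich anchors dominate the distribution, that’s usually the first thing to fix.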

Acquiring backlinks is a big topic and that’s why I’ve dedicated an entire page to it. Make sure you check out the guide above to learn more.

The last piece of the puzzle for improving your keyword rankings in Google is simple:

6. Be Patient

Most SEOs don’t talk about this because they usually have clients breathing down their necks. No client wants to hear “you just need to be patient” when they’re spending money on your SEO services. But it’s the truth.

In my experience, most SEO campaigns pick up steam around the 4-6 month mark, but this isn’t a rule. Every campaign is different and therefore unfolds in its own unique way.

Just stay the course. Keep creating value, acquiring quality backlinks, and caring about user experience. Do these things continually and your rankings will improve without risk of getting penalized.

It’s funny because I believe that impatience is the root of all SEO evil. People get sick of waiting for results, so they decide to jump into the grey or black hat world. I used to live in that world, and I used to get results fast. But it doesn’t matter how quickly you get results if they’re erased overnight.

I would rather take the slow and steady approach and build a long-term business. Don’t look for shortcuts because there aren’t any. The businesses that succeed with SEO in the long term are those that add the most value to the Internet (within their target niche).

Last Word on Google Keyword Rankings

People still obsess over Google keyword rankings, but they aren’t the most important Key Performance Indicator (KPI). The most important KPI for an SEO campaign is ORGANIC SEARCH TRAFFIC.

Traffic is the goal of SEO. Not keyword rankings. Don’t forget that!

Now of course, you need to rank for keywords to get organic search traffic. But just remember to prioritize traffic growth over individual keyword rankings as your primary KPI.

With that said…

Do you want better keyword rankings and more traffic from Google?

You need to join our free SEO 101 course. This course will first show you how to convert the traffic you’re already getting. Then, I’ll show you how to get more organic search traffic from Google.

You have nothing to lose when you enroll today because it’s free.

Enroll: https://www.gotchseo.com/seo-101-course