
10 Answers To Why do You have a high bounce rate

Posted by on Jul 16, 2018 in SEO Articles | Comments Off on 10 Answers To Why do You have a high bounce rate


10 Questions to Ask Yourself to Get a Lower Bounce Rate

I’m gonna start off by asking you a question: do you know how high is considered high?

What is a good bounce rate? Here’s the benchmark average bounce rate by web page type, according to Quicksprout.

The reason we’re looking at this is that a high bounce rate isn’t necessarily bad. Google won’t push your page to the depths of the SERP just because you have a 90% bounce rate.

Here’s the thing: Google doesn’t even use the bounce rate reported in Google Analytics as a ranking factor. In fact, they don’t use any of the numbers you can pull from Analytics for ranking.

However, that doesn’t mean you shouldn’t worry about a high bounce rate.

Remember this, bounce rate is personal and individual. Take our site as an example, our blog posts usually have a bounce rate around 80% while our landing pages have an average of around 60%.

Our blog has an average bounce rate of around 80%.

Why? Because they have different purposes. Look at your own browsing pattern. Would you stay longer than necessary on a website once you’ve got what you want? Nah, neither would I. That’s what blog posts give: answers and suggestions that people need. When they get them, they’ll leave. So a higher bounce rate does not necessarily mean a page has not achieved its purpose.

Therefore, the correct way to look at bounce rate is by aligning it with the purpose of your web page.

A landing page has no value if visitors don’t engage further. The same could be said for a homepage, whose purpose is to guide visitors to explore the website. If those pages have a bounce rate of more than 65%, things are definitely not going well.

Before we launch into more details on why high bounce rate happens, are you certain that you understand what a bounce rate is?

What does bounce rate mean?

No, no, Mr. Bean. Not that kind of bounce.

Bounce rate is one of the main metrics you can pull from Google Analytics. A bounce happens when a visit to a website consists of a single interaction, and the bounce rate is the percentage calculated by dividing single-page sessions by all sessions.

Google dictionary defines bounce rate as “the percentage of visitors to a particular website who navigate away from the site after viewing only one page.”

Let’s break it down a little.

Percentage: single-page sessions divided by all sessions, times 100%
Navigate away: exits the site
Viewing only one page: a single engagement hit (engagement includes pageviews, events, e-commerce items, e-commerce transactions, social interactions, and other user-defined actions)
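To make the percentage part concrete, here’s a minimal sketch of the calculation. The function name and the sample numbers are my own, purely for illustration:

```python
def bounce_rate(single_page_sessions, all_sessions):
    """Bounce rate: single-page sessions divided by all sessions, times 100%."""
    if all_sessions == 0:
        return 0.0  # no sessions at all, so no bounces to measure
    return single_page_sessions / all_sessions * 100

# Example: 800 out of 1,000 sessions viewed only one page.
print(bounce_rate(800, 1000))  # 80.0
```

So a blog that averages an 80% bounce rate is simply seeing four out of five sessions end on the page they started on.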

If you are interested in the detailed technicalities behind bounce rate, here is a good read on demystifying the Google Analytics bounce rate.

A visitor can land on your website, spend 10 minutes reading your blog post, then exit, and that would count as a bounce. How could you say a visitor who read the article you spent 30 hours crafting from the first line to the last was NOT engaged?

The point is, bounce rate should be treated as a gauge in accordance with the purpose of a page. If your web pages have an alarmingly high bounce rate that doesn’t align with their purposes, then something has to be done – be it a UI redesign or a change in your content marketing strategy.

Here are 10 questions to ask yourself if you want to know how to lower bounce rate.
1. Have you set up a clear Call-To-Action?

An example of a clear call-to-action button.

Another example of a clear call-to-action button.

If you end a transaction with “Thank you.”, well, that probably means goodbye. However, if you end a transaction with “We’ll be having a flash sale next Wednesday, hope to see you then.”, I’d say you’ll see the customer again.

The same thing applies to your website. If you want visitors to go further into your website, you need to guide them to do so.

Look at your page: is there a clear and urgent CTA? Or are there so many CTAs that they risk confusing the visitors? Every web page should have a maximum of two CTAs.

Here are some examples: put up a “Start your free trial” at the end of your product description page, a “Subscribe for future posts” at the end of your blog posts, or a “Sign up” in the upper right corner of every single page.

2. Does your site have an always visible navigation bar?

An example of breadcrumbs navigation.

Ever scrolled all the way down to the end of a page, then had to scroll all the way back up to the navigation bar? I’d just give up halfway and click exit.

If you’re not gonna make it easier for visitors to navigate around your website, you’re really just making things harder for yourself.

Be it breadcrumbs, a navigation bar, or a quick return-to-top button, implement a navigation system that eases the navigation process for visitors, thereby encouraging them to explore different pages instead of leaving.

A handy back to top button could improve the user experience tenfold.

3. Are you talking to your targeted persona?

I’ve talked about creating a persona for your targeted audience before. Keep that in mind all the time. Be it when you’re writing blog posts, putting up new products or creating a CTA.

Aligning with your page’s purpose, should your copy sound convincing, exciting, or helpful? Are you addressing their pain points? More importantly, do you sound respectful?

Stephanie, who has a 2-year-old child, might be more receptive if you talk to her kindly and help by keeping instructions clear, while Peter will want every detail about the camera you’re trying to sell and will feel more at ease with a tone that is just as excited about the newest gadgets as he is.

Stephanie would appreciate it if you keep your instructions short and clear.

Sympathize with your targeted personas and put yourself in their shoes. What will they need? What will make them stay? Answering those questions can help lower your bounce rate.

4. Is your website mobile friendly?

Utilize Google’s mobile-friendly tool to check your website’s rating.

A smartphone is the primary internet browsing device for one-third of people. Take me, for example: I don’t even turn on my laptop after work. Mobile search is becoming increasingly prominent, and I can never stress enough the importance of being mobile-friendly. You can opt for responsive website design or the AMP framework. A responsive website adapts to desktop, tablet, and mobile, while AMP is a mobile-targeted format.

Check out the number of mobile visitors in your Google Analytics – you might be surprised. Take our site, for example: even though 80% of our visitors are on desktop, we do have a slowly rising number of mobile users. That’s why we make sure our site is responsive and shows up correctly regardless of the visitor’s device of choice.

Our website has a slowly rising number of mobile users.

You should not ignore the potential of mobile users. If your website is not loading and displaying properly, visitors will not hesitate to abandon it, which will hike up not only your bounce rate but also your exit rate.

All in all, being mobile-friendly will encourage more visitors and lower your bounce rate.

5. Are the ads intrusive?

Blocking your own site’s content by showing me an advertisement is not gonna make me feel like staying any longer.

If the first thing that loads the moment I enter a web page is a pop-up whose X button I can’t even find, I’d just give up on that page altogether. No matter how good your content is, if you insist on putting it behind a multitude of pop-ups, then I’m sorry, because you are only making things difficult for yourself.

If you really want a pop-up, an exit pop-up that only appears when the visitor is done viewing your content and ready to leave the page is a passable option.

Oops, we ourselves are guilty of the exit pop-up technique.

The point is, intrusive ads make for a crappy user experience. If intrusive ads like these repeat on every single page of your website, it is only natural that visitors won’t stick around for long.

6. Have you disabled autoplay video and audio?

Autoplay video advertisements eat away bandwidth and are annoying to boot.

For the sake of a better user experience, I shall declare a ban on autoplay video and audio. Well, of course, I have zero power to decide on things like this. But how annoying is it to enter a web page only to be blasted with moving pixels that I couldn’t care less about?

I’m sure the majority of internet users would agree with me. Users are appalled by autoplay media, and you can be sure that they will not hesitate to exit your page the second they are bombarded by these bandwidth suckers.

If you have to insert video and audio, make it so that they only play when prompted – for a better user experience and a better bounce rate for yourself.

As much as I love the game, my bandwidth doesn’t really love it when you autoplay the trailer for me.

7. How long does it take to load your page?

Utilize PageSpeed Insights by Google to see how many seconds it takes to load your website on both mobile and desktop.

I’ve said it before and I will say it again: the rule of thumb is that websites should load completely in under 10 seconds, and half of mobile visitors will abandon a website if it takes more than 3 seconds to load.

Images should be optimized, both in terms of size and usage. Those banners at the sides of the page showcasing third-party ads are also slowing down your web page. Widgets, JavaScript, plugins: do you really need them?

Are they benefiting your visitors? Are they benefiting your bounce rate? Use them only when necessary.

8. Does your title correlate with the content?

If you’re expecting a pastor stopping a fight between Trump and Clinton like a referee you’ll be disappointed.

Visitors are attracted to click on a page based on the title, and maybe 2 lines of description and a picture. Imagine clicking on a page titled “How to change a tire” but instead it talks about how to change engine oil. I would nope out of it in seconds.

You need to give them what you advertise. The title of this post says bounce rate, and I am talking about bounce rate. If I were talking about exit rate instead, would you have stayed? Would you trust any of my other blog posts?

A more eye-catching or keyword-matching title might benefit you, but not if it doesn’t match the content itself. False advertising will do your bounce rate no favors.

9. Are you updating your content?

A 2019 marketing trends post would be nice.

If you have a post talking about the 2014 median size of web pages, why not update it to a 2018 version? Especially in tech industries, information that was correct a year ago may not be anymore.

Keeping your content updated means keeping it relevant. Visitors are more likely to stick around knowing a website is updated and well maintained.

On the flip side, if the information on your website is outdated and irrelevant, it’s easy to see why visitors would not be interested in staying.

10. Do you have a story to tell about your brand?

A brand that is willing to share their story and value deserves at least some of my time.

Building trust, relationships, and a budding community is the best thing a website or a brand can achieve. You don’t do that by flinging sales pitches at visitors. They know sincerity when they see it.

Share the story of your brand; be sincere and genuine. Stories provoke emotions, and emotions are what build loyalty.

If you want visitors to engage with you, you should be frank with them and have their best interests in mind. Why would I stick around listening to a salesman trying to sell me stuff that I don’t want and don’t have the money to buy? It’s the same thing.

Tell them a story; tell them the values that you hold as a brand. Google might be getting rid of its “don’t be evil” motto, but that doesn’t mean you have to hide yours.

A behind-the-scenes story really makes a company feel much more personal.

Yesoptimist is one of those websites whose blog posts I absolutely enjoy reading, because you can tell that they are honest and their backstory is engaging. Reading one blog post made me curious about their other posts, which means they won’t be getting a 100% bounce rate from me.


The EU is Wrong, but Google is Still in Trouble

Posted by on Jul 16, 2018 in SEO Articles | Comments Off on The EU is Wrong, but Google is Still in Trouble


I’ve found it tough to get my head around all the arguments in the recent EU judgement against Google. I find that writing helps me get my thoughts together and work out what I really think, so here goes. Let’s find out whether I agree with the ruling or not…

First – the background – you only really need to read two things to get the gist of the complaint:

The EU’s press release announcing the record €2.4 billion fine is surprisingly accessible and readable, and links out to a bunch of useful resources

Google’s response is relatively short and seemingly light on content, but actually frames the key points of the counter-argument well when you know what you’re looking for – see below!

Definitions

In a case like this, which relies on complex technical subjects and on areas of the law (in this case competition law) with which most of us are not familiar, I think it’s useful to make sure we are all talking about the same things. In fact, this is probably my biggest criticism of the EU’s press release (not the judgement – just the communication): it would have benefited greatly from clearer definitions up front. Here are my best explanations of the key elements you need to understand:

Search (sometimes “general search”) – at its simplest, this is the process of typing a query, and receiving links to pages that satisfy your need. In practice, it has extended to other inputs (e.g. voice) and other outputs (e.g. answers, rich results etc.). We’ve written extensively about changes in the search market

Organic search – the results of searches that appear in an order determined by the search engine based on their quality and relevance as well as their likelihood of satisfying the searcher’s intent. Organic (or “natural”) search results are not advertising results, no money changes hands, and there is no way to pay for inclusion or for a better ranking

Paid search (also “Pay Per Click” or PPC) – adverts sold by Google and other search engines allowing advertisers to pay to appear next to search results based on the search word or phrase (and other variables)

Comparison shopping engines (CSEs) – websites where you can search for a product type, category or brand and then compare different products and/or different retailers – typically by sorting, filtering, and applying facets. The business model is typically either for retailers/brands to pay directly for inclusion, to pay for the clicks they receive, or for the engine to receive affiliate payouts when a searcher buys from the destination site

Google Shopping – Google’s own comparison shopping engine – where you can sort and filter products, apply facets, and click out to retailers’ sites – typically accessed via the “shopping” link at the top of a search results page

Product Listing Ads (PLAs) – a form of paid search whereby adverts for individual products (complete with rich information in the form of photos and prices) appear above, or to the side of the search results. Individual PLAs are presented in the same form as product links in Google Shopping, and the data comes from the same sources, but PLAs in the search results do not constitute a comparison shopping engine to my mind – there is no filtering and there are no facets


A challenging definition – what markets are we talking about?

In Google’s response, they highlight Amazon and eBay as examples of sites that have grown during the time in question (despite Google’s alleged anti-competitive behaviour), and which offer much of the user benefit of comparison shopping engines. The user experience is clearly similar, while the key difference is that you can actually check out and buy on both sites. In some cases on Amazon, you are buying directly from Amazon – i.e. they are also the retailer – and in many cases on both eBay and Amazon, they are functioning as a marketplace so you enter your payment details on their site, but you are buying from a third party. This raises two difficult and related questions for me – questions which the EU has not answered to my satisfaction:

Must comparison shopping engines send users off to another site in order to purchase? If not, and Amazon and eBay are examples of comparison shopping engines, then I think the case is much harder to make – certainly some CSEs have fared poorly during the time period in question, but some (most notably Amazon) have thrived

If comparison shopping engines are narrowly defined, is it really a separate “market”? The EU’s case relies on a finding that Google is using its dominance in one market (general search) to crush competition in another market. I’m not even convinced that “product search” is actually a separate market to “general search” (I’m inclined to think users see everything they do from the main Google search box as one thing), but if you define comparison shopping so narrowly that it’s a different market to both “search” and “searchable marketplaces” (or whatever you label Amazon and eBay as) then the sub-divisions stop making sense to consumers in my opinion

Going even further down this rabbit-hole, I would love to have an expert (a competition lawyer?) explain to me how the markets are delineated and where these different features/businesses fall – all of which satisfy some of the same user intent:

General search

Product search

Product comparison engines (facets, filters, etc)

Visual search (e.g. Pinterest)

Product recommendation sites (e.g. The Wirecutter)

The easy points of agreement

I don’t think it is hard to make the case that Google meets the criteria to be considered a monopoly in “search” (almost any way you define it) in Europe. Google themselves make essentially no effort to rebut this, so let’s allow this strut of the EU’s argument.

Google has a comparison shopping engine – in the form of Google Shopping (previously Google Product Search and originally Froogle). You can see this in action by going to the “Shopping” tab at google.co.uk and searching for a product (the example in Google’s own post is [puma shoes]). You then get the opportunity to compare products by a range of metrics and facets, with the links going out to places to buy the individual products (retailers, manufacturers, brands).

The problems with the EU’s case

In addition to the definitional problem I highlighted above (and that Google presses on heavily in their response), I think there is another problem with the EU’s case in the specific market of shopping (note that the ultimate EU case is much broader and covers many other verticals – more on that below).

The EU’s ultimate finding is:

Google abused its market dominance as a search engine by promoting its own comparison shopping service in its search results, and demoting those of competitors — Commissioner Margrethe Vestager

I have issues with both parts of that:

Where does Google promote “its own comparison shopping service in its search results”?

You will not find Google Shopping pages ranking in Google’s organic search results. Nor will you find links to Google Shopping in paid search links. There is only one link to Google’s own CSE on any of their search results page – on shopping queries it is here:

And on other searches it is in the “more” menu:

Now, it’s possible that even this level of cross-promotion (simply linking to the Google Shopping product from their menu) is too much, and maybe they should remove it (see below) but this is too small to warrant the huge fine in my opinion. (There is actually one more way of getting there – if you click the right-arrow by the PLAs multiple times to scroll through all the products on offer, you eventually get to a link that takes you to the Google Shopping results page. I’m willing to bet that the percentage of people who actually do this is miniscule and Google could remove it with essentially no impact on Google Shopping).

I believe the EU has a problem with the individual products listed at the top of the search results in the first image above – but that is very clearly not comparison shopping functionality (it is very similar to the links to individual products in the organic results below) – it’s simply what paid search looks like on commercial product queries. It is also very clearly not links to Google’s own CSE – those links go out to retailers’ sites – not to Google Shopping. The data comes from the same source – that is all.

“…abused its dominance…by demoting…competitors”

As I pointed out above, Google Shopping does not appear anywhere in the organic search results. To the extent that they treat Google Shopping differently to a third-party CSE, they treat it worse – ciao.co.uk (one of the complainants) does appear somewhere in the organic results, while Google Shopping appears nowhere. [In reality, they actually treat it identically – if Ciao were to block Google’s crawlers the way Google Shopping does, they would also rank nowhere in the organic results].

Given that none of the organic results are Google properties for any of these product queries, any “demoting” of a specific competitor involves the promoting of another. For every Ciao that loses rankings, there must be a retailer, marketplace, or other CSE that gains rankings.

This argument is problematic in other verticals – where there are links to Google properties in the search results, and those links do push down links to competitors – but it holds strong in comparison shopping as far as I can see.

The competitors are objectively poor in comparison shopping

If you rule marketplaces and retailers like eBay and Amazon out of the “comparison shopping engine” market, then the general quality of these sites is low. The complainants in particular, foundem.co.uk and ciao.co.uk, are both much worse user experiences than either regular Google search or Google Shopping. This SearchEngineLand article breaks Foundem down nicely, while Ciao is still online and you can go and see for yourself:

Slow loading, intrusive irrelevant banner advertising, broken links, missing images, and irrelevant reviews:

“Advantages: Great price, good quality, useful pockets” — review on the Puma shoes results page (emphasis mine).

I know that EU competition law focuses on the impact on competitors rather than the impact on consumers as US competition law does, but we should also step back a moment and look at the fact that these sites complaining about unfair treatment are objectively worse than Google’s offering.

[Note that this is not the case in other verticals – travel and financial services, in particular, are sticky areas for Google – see below.]

A ballsy response from Google

I hesitate to say that this would be my recommended course of action if I were advising Google, but a direction I would love to see them take is as follows:

Leave PLAs as they are – if comparison shopping is a separate market to “general search” in which Google has a monopoly, then PLAs definitely fall in the general search part rather than the comparison shopping part. They are integrated into the results a searcher receives when they perform a search that starts at the Google homepage, and there is no comparison functionality – it simply links to products

Remove the shopping link in the top menu – this is the one area I can see that they have favoured their comparison shopping engine (Google Shopping) over others (e.g. ciao.co.uk – one of the complainants) who cannot get their homepage linked from the top menu

Open up Google Shopping pages to their own search index – i.e. enable pages like the result you find when you search [Puma shoes] on the Google Shopping tab to be indexed and appear in the regular organic search results (to be clear, this does not happen at the moment – Google keeps these pages explicitly out of the main search index). Doing this will increase competition in the general search results for the complainants, but it clarifies that Google is treating their comparison shopping engine (Google Shopping) exactly on a level playing field with competitors such as ciao.co.uk and paves the way for them to treat (all) comparison shopping engines as harshly as they like in regular search [I’m not the first to think of this – see Danny Sullivan’s excellent article from the beginning of this case]

The reality: this is really bad for Google

The EU has started with what I think is the weakest of the verticals, and I think there are strong arguments that Google has not abused their market power in the specific ways this case claims.

But the EU has ruled against Google. In the weakest case against them there is.

The EU has shown here that they are prepared to take action to defend businesses offering worse user experiences against integrated changes Google makes to their core search engine. Shopping is an arguable case, but there are many more verticals where this precedent opens the way for future large fines:

Travel (especially flight search)

Maps / local

Jobs

Financial products

Images

Video

Apps

News

If the EU insists on treating each of these verticals the same way they have treated shopping, then Google is facing many huge fines – quite aside from the AdSense and Android cases (each of which could be a big deal in its own right). If the EU is prepared to take a lower-quality competitor in each case and say competitiveness has been harmed by Google’s inclusion of a great user experience directly in the general search results, then each of these is at least as egregious as the comparison shopping case.

Even worse for Google, some of these other competitors are not lower-quality. In travel and financial products, in particular, there are some spectacularly good sites offering great UX. The defence in those areas will not be as easy for Google.

What does real regulation of Google look like?

The case for some increased regulation of Google (and other tech giants) is growing. I recently wrote another article breaking down my objections to a New York Times article calling for a break-up.

I certainly don’t have all the answers, but I did get drawn into a bit of speculation about what I thought effective regulation could look like in the comments on that post. To be clear, I don’t expect to see an unbundling, but I was interested to see the same thought experiment applied to Amazon the other day.

For more on the subject, I recommend this week’s public Stratechery article and the other articles linked from it. Stratechery remains my top paywalled recommendation – easily worth the $100 / year in my opinion.

Why You Need to be Building for Intelligent Personal Assistants

Posted by on Jul 16, 2018 in SEO Articles | Comments Off on Why You Need to be Building for Intelligent Personal Assistants



Apparently seven years ago Jeff Bezos, inspired by a love of Star Trek, decided that Amazon should build something that you can talk to, and which would turn your commands into actions. Now Amazon Alexa is competing with a growing number of intelligent personal assistants[1] from almost every corporate behemoth around. Alphabet (the company that owns Google) has Google Home, Microsoft has Cortana, and Apple, of course, has Siri, which was perhaps the best known early on. Viv and Facebook M are also interesting contenders, but the state of play for each of those is different enough that we’ll cover them at the end. If you ask any of these companies about their investment in this endeavour, they’ll say that digital assistants are the next big thing.

While each of the big four has its physical product, the battle isn’t between Google Home and the Echo Dot – what they need to succeed is their operating system. Microsoft is in this race by merit of owning the most popular way we interact with computers by mouse. Owning the most popular assistant OS could help any of these companies define the next decade.

Regardless of the millions of pounds and work hours that each of these organisations has put into their respective offerings, one thing they have in common is this: it’s not enough. Not yet, not for the lofty goal of having a program that can understand anything you say and do just what you want it to.

That’s not to say that these programs aren’t incredibly impressive technological steps forward. However, reviews like this Business Insider comparison and this one from CNN make it pretty clear that asking too much of these programs will quickly reveal how much they cannot do.

The problem is that we’ve come to expect a lot from computers. I would be personally outraged if my mobile phone refused to update my social media, download video, send and receive emails from multiple accounts, display every photo I took over the past few years (regardless of what device I took it on) and tell me the top speed of a grizzly bear[2]. This functionality has become synonymous with the device but mobile phone manufacturers are responsible for a relatively small proportion of that. We’re used to platforms that use years of established protocols to support all kinds of software. Now, every company trying to build the world’s AI is coming up against two core problems:

1. They are building a brand new breed of platform. Intelligent assistants are different enough from existing operating systems that they need to put in a lot more work rebuilding existing connections.

2. Previously there was a certain amount of leeway for programs to be pedantic. Up to a certain extent, we accept that it’s our fault for not pushing the right buttons. We don’t have the same patience when speaking, so these programs have to be able to respond to pretty much anything a person might say.

Apple once described the iPhone 6’s multi-pressure touch as “trying to read minds” but really, technology has been about reading trends and teaching minds – a series of incremental tweaks with the onus on us as consumers to adapt. The challenge here is to recreate decades of program integrations and, as a small side project, codify the entire spoken human language.

It can’t be done. Certainly not by one team and, let’s face it, if you were racing Apple to build Weird Science would you bet on yourself to do it alone?

What makes personal assistants interesting is that Alphabet, Amazon, Microsoft, and Apple are vulnerable and need you to get on board. They have bet a lot on this, and none of them wants to be Betamax. Or Zune.


Your chance

This is where you come in. In order for any one of these companies to win this race, they need individuals and companies to develop a lot of the programs, or at least the program-specific integrations, for them.

Your choice now is whether you invest the time to get a stake in the ground, knowing that most will welcome it but that you’re also betting on their success.

Compared to designing a standard app, the time and training investment for many simple functions is hugely reduced.  There’s no design involved and competition is far lower than you’ll find in any of the respective app stores. As a proof of concept, with no coding knowledge pre-February, I’m building an interactive program that could integrate with a bunch of messaging platforms as well as Google Home and Alexa (more on that later).

Amazon are offering free bootcamps to learn more about building Alexa skills. Image source: Guillermo Fernandes via Flickr

The companies at play are also far more open here than in other arenas. Amazon is running free half-day bootcamps to teach the principles of building Alexa skills, and is giving out a plethora of prizes and incentives for successful attempts. Alphabet is offering to suggest your program if a user asks for something that it could fulfil – the kind of relevant, single-result search ownership that companies would kill for in a browser. Companies that are taking advantage of these platforms are already reaping the rewards: for instance, the JustEat skill has been preinstalled on Amazon Echo from the first shipment thanks to their chatbot strategy – a huge advantage over competitor programs which users will have to download manually. What’s more, a lot of these new ecosystems use engagement metrics as a way of ranking programs, so by starting now and building up those numbers before competitors cotton on, companies can vastly improve their chances when things get far more crowded.

How to build a chatbot

Unsurprisingly, the biggest change you need to make to capitalise on AI is replacing button clicks with phrases. Each of the big four has started advocating platforms that take the burden of recognising a sentence (spoken or written), breaking it up, and sending you the important information in digestible chunks. You just have to tell them what is important and when (I’ve included a list of these platforms at the bottom of this post).

By and large, the following intentionally broad instructions will serve you in creating a conversational application on any of these platforms, as they all have a few things in common. This will give you an idea of the way you need to think about interacting with them. In the coming months, I’ll be writing a more in-depth post about how I created my bot using the api.ai platform, which Alphabet acquired last year.

Plan your interactions

This will be easier once you’ve got a feel for the platform, but you almost need a flowchart for the conversation, with markers for times when your program is doing things behind the scenes. David Low, Product Evangelist for Amazon, says that the lowest-rated apps often have too many options. He recommends starting very small and adding options later.

Always plan your interactions to get an idea how conversations will play out.

Decide what you want people to call your program

This is the part of the process that is the most ‘SEO’ and applies most specifically to spoken interactions. Essentially this is what people need to say to wake up your program. Think “OK Google, I want to talk to Superdry Online” or “Alexa, ask Domino’s to order me a 12-inch pizza”. It’s a bit clumsier than might be ideal, but it means you know what you’re getting, rather than accidentally posting your Spotify password on Facebook.

Usually, once you publish your program it’s too late to change your invocation so you need to think in advance about something short, memorable, and descriptive. It helps if your brand name already ticks those boxes but you’re likely to run into problems if you have a web2.0 name like ‘Pinkr’ or ‘seetbk’. The platforms are prone to confusing homophones and you may need to get in touch with the companies directly to overcome that confusion. The fact that they are willing to work with individual brands to manage proper brand recognition is one sign of the opportunity at this point.

Create the phrases you want your program to respond to and highlight the variable information

On all of these platforms you create phrases with parts that won’t change, plus parts that will. For instance, the phrase “My name is Slim Shady” is of the format “My name is {name}”. This means the platforms handle the heavy lifting of variations in speech, which takes much of the burden off any external code.
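To make that concrete, here is a toy illustration in Python of how a phrase template with a slot can be matched against an utterance. This is a simplification for intuition only – it is not how any of these platforms is actually implemented:

```python
import re

# A template phrase with a {name} slot, compiled into a regex with a
# named capture group -- a simplified stand-in for the phrase matching
# these platforms do for you.
TEMPLATE = re.compile(r"my name is (?P<name>.+)", re.IGNORECASE)

def extract_slots(utterance):
    """Return the variable parts of an utterance, or None if no match."""
    match = TEMPLATE.match(utterance.strip())
    return match.groupdict() if match else None
```

The platforms do far more than a regex (synonyms, fuzzy matching, machine-learned variations), but the shape of what you get back – a dictionary of slot names to values – is essentially this.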

Deal with the JSON it sends you

First things first – there are scenarios where you won’t need to code at all; going code-free just limits what your bot can do. I created the simple back and forth you can see in this gif in about ten minutes using no external code. If you have coding experience or are comfortable with learning, you can integrate pretty much any of these services, as long as you can securely receive and respond to a JSON POST request within about 5-8 seconds.

Test and go live

Most of the services offer some kind of easy integration out of the box. They’ll often walk you through it and, if all you need is a relatively standard setup, this will probably take you all of twenty minutes.

You’ll then usually need to go through a slightly separate process to actually publish, mainly medium-specific quality checks.

Fortunately, platforms like api.ai and converse.ai allow integration to multiple mediums at a time. So, having built for Google Home, you can roll out to Facebook, Slack, Telegram etc. with relatively little overhead.

The next five years

If you can only build for one platform and you’re trying to prioritise, you can’t go terribly far wrong with any of them. Microsoft’s linguistic processing platform, LUIS, is integrated with the popular Microsoft Bot Framework, which has almost tripled developer usage in the last six months and stretches far further than Cortana. This is the framework that JustEat and Three are using to build across multiple mediums, including website integrations. It’s worth noting that consumer usage figures for Cortana may be heavily inflated depending on whether Microsoft is including any use of the Windows 10 search bar. However, they are also using those search bar inputs to perfect their back-end machine learning platform, which should help improve accuracy across all applications.

Alphabet’s recommended platform, api.ai, is easy to pick up and can launch on a number of mainstream chat mediums with just a few clicks. Alphabet can also rely heavily on their Google search engine to make their assistants more fully featured and attractive to users from the off. Unlike with Alexa, users don’t need to manually select your bot to be installed on their device. This helps users access your service, but means that individual requests become more like web searches than uses of a specifically chosen app. Instead of competing once for the install, you’re competing every time a user says “Hey Google” – and getting in early to be the program that Google Assistant suggests will be a huge win.

Apple seems to be the furthest behind, with their developer kit, SiriKit, pretty much limited to things that Siri can already do. That being said, Apple’s dominance in smartphone hardware and OS is a strong foothold. Apple’s laser focus on its own ecosystem could hamper long-term plans to be everyone’s HAL 9000, but in the short term, people who have committed to Apple’s vision are the casual consumers closest to having an omniscient machine that follows them from room to room.

Apple’s focus on its own ecosystem could cause it to lose out in the personal assistants arms race. Image source: Kārlis Dambrāns via Flickr.

Facebook M, Facebook’s intelligent personal assistant is an interesting departure from the norm. Rather than trying to create a program that can do everything, Facebook’s offering is more like partial automation. Facebook M is designed to deal with as many queries as possible like the other IPAs but, when it gets stuck, send the request on to human customer service reps that go as far as calling the DMV. The idea is that everything these reps do is recorded, so that Facebook M can eventually do it alone. While this is currently only available to limited geographies, and could run into some serious scalability issues, Facebook M has the potential to deliver the customer experience they’re all striving for within far shorter timelines.

Viv is another IPA worthy of mention at this point. Viv was created by the team which originally built Siri. In a launch video, co-founder Dag Kittlaus explains that Viv receives a request, checks all the integrations it has at its disposal, and then writes the code it needs to fulfil the request itself. While their developer centre isn’t yet open to wider use, you can email them about a partnership, and this different setup should mean the platform is far easier to build services for.

For my money, Amazon is making the most interesting strategic decisions. They are actively courting programmers and brands and are expressly separating Alexa, the program, from the Echo devices that run it. Amazon’s laissez-faire attitude to where Alexa can run meant that CES 2017 included Alexa on devices from cars and washing machines to direct Echo competitors. They’ve even managed to sneak Alexa onto iPhones by adding it as a feature to the Amazon app, which many users already have installed. This can’t compete with the ease of summoning Siri at the hold of a button, but it’s a shot across the bow for Apple’s own assistant. It’s particularly interesting that Amazon has said they think digital assistants should be able to use each other – a nice ideal and a fantastic way to break out of platform silos if one service becomes dominant.

Chances are that all of these players are too big to be stamped out of the race entirely, but if one of them can reach a critical mass of developers and users and become the de facto disembodied voice, that is going to become very interesting indeed. And particularly valuable for those businesses that have the foresight or agility to keep up.

Resources
Platform specific resources

Microsoft is pushing LUIS in conjunction with the Microsoft Bot Framework, Google have invested in api.ai, and Amazon recommends building Alexa skills using the purpose-built section in developer.amazon.com. Amazon is also offering free (up to a point) hosting for your external code on aws.amazon.com – the downside is that the Amazon platform is a bit more dependent on code but they make linking into your code easier. Apple gives information about SiriKit, their SDK built specifically for Siri, here.

Sample JSON from API.AI

This isn’t identical to the messages that all of the platforms will send, but it’s the kind of thing you can expect:

{
  "id": "9962fb04-3808-472e-9fe0-f34de1f029b7",
  "timestamp": "2017-06-26T17:27:48.156Z",
  "lang": "en",
  "result": {
    "source": "agent",
    "resolvedQuery": "My name is Slim Shady",
    "action": "",
    "actionIncomplete": false,
    "parameters": {"name": "Slim Shady"},
    "contexts": [],
    "metadata": {
      "intentId": "2c7ba931-5ea7-4693-b384-eea23a661c68",
      "webhookUsed": "false",
      "webhookForSlotFillingUsed": "false",
      "intentName": "My name is name"
    },
    "fulfillment": {
      "speech": "",
      "messages": [{
        "type": 0,
        "speech": ""
      }]
    },
    "score": 1
  },
  "status": {"code": 200, "errorType": "success"},
  "sessionId": "1b0e0d9a-0efb-4d48-9dfc-9a1d5ebf1364"
}
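If you do hook external code up, handling this payload mostly means pulling out the matched parameters and replying with a fulfilment object. A minimal sketch in Python – the field names follow the api.ai sample above, but the greeting logic is an invented example, not part of any platform:

```python
import json

def handle_webhook(request_body):
    """Pull the matched parameters out of an api.ai-style payload
    and build the JSON reply the platform expects."""
    payload = json.loads(request_body)
    params = payload.get("result", {}).get("parameters", {})
    # "name" is the slot defined in the sample intent above
    name = params.get("name", "stranger")
    reply = "Hi, {}!".format(name)
    # api.ai v1 responses carry "speech" (spoken aloud) and
    # "displayText" (shown in written mediums)
    return json.dumps({"speech": reply, "displayText": reply})
```

Roughly speaking, Google Home reads the "speech" field aloud while written mediums like Slack show "displayText", which is why you usually populate both.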

[1] Or interactive personal assistants, or Digital assistants, or AI, or bots, or any of the other host of names that have sprung up.

[2] In case you’re interested, apparently it is almost 35mph according to speedofanimals.com although I have not one clue what the “feels like” column means.

 

SEO for JavaScript-powered websites (Google IO 18 summary)

Posted by on Jul 16, 2018 in SEO Articles | Comments Off on SEO for JavaScript-powered websites (Google IO 18 summary)


You have probably heard that in the recent Google I/O 18, Google shed some light on SEO.

Tom Greenaway and John Mueller of Google presented a session about making your modern JavaScript-powered websites search-friendly.

They listed some recommended best practices, useful tools, and a Google policy change.

Here’s the thing:

In a pretty un-Google-like way, the duo also shed some light on how the actual crawl and index process for JavaScript websites works.

Check out the video here:

But if you don’t want to spend 40 minutes watching the recording, hang around, because here’s a quick summary of the session’s key points.

A brief background introduction on the presenters…

Tom Greenaway is a senior developer advocate from Australia, while John Mueller (aka johnmu – ring a bell?) is Google’s webmaster trends analyst from Zurich, Switzerland.

How do crawl, render and index work for JavaScript-powered websites?

Tom started the talk by sharing a little background of search engines.

Here’s the deal,

The purpose of search engines is to provide a relevant list of results to answer users’ queries. To do that, a library of web pages is compiled from which answers are pulled.

That library is the index.

Building an index starts with a crawlable URL.

Now, the crawler is designed to find contents to crawl.

But, in order to do this, the content must be retrievable via a URL. When the crawler gets to a URL, it will look through the HTML to index the page as well as find new links to crawl.

Here’s a diagram on how search works for Google.

So how do you make sure that your content is reachable for the Googlebot?

Here’s what you need to know: Tom shared six steps to ensure your web page will be indexed.

1. Make sure that your URL is crawlable
– Set up robots.txt at the top level domain of your site. Robots.txt is useful to let Googlebot know which URLs to crawl and which to ignore.
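For illustration, a minimal robots.txt might look like this (the disallowed path and sitemap URL are placeholders):

```text
# Allow everything except an example /private/ section
User-agent: *
Disallow: /private/

# Point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```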

2. Utilize canonical tags
– In cases of content syndication, where content is distributed across different sites to maximize exposure, the source document should be tagged as the canonical document.

3. Make sure the URL is clean and unique
– Don’t list session information on the URL.

4. Provide a sitemap to Googlebot
– That way the crawler has a list of URLs to crawl and you can sleep better at night knowing your website is properly crawled.

5. Use the History API
– It replaces the hashbang (#!) URL scheme, which, if used, will no longer be indexed.

6. Make sure your links are anchor tags with HREF attributes
– Googlebot only recognizes links with BOTH anchor tags and HREF attributes; otherwise, they won’t be crawled and therefore never indexed.
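To illustrate that last point with placeholder URLs – only the first of these patterns is a link Googlebot will follow:

```html
<!-- Crawled: an anchor tag with an HREF attribute -->
<a href="/products/cricket-bats">Cricket bats</a>

<!-- Not crawled: an anchor without HREF, relying on JavaScript -->
<a onclick="goTo('/products/cricket-bats')">Cricket bats</a>

<!-- Not crawled: not an anchor tag at all -->
<span class="link" data-url="/products/cricket-bats">Cricket bats</span>
```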

What’s more important is,

Tom said Google has been encountering a list of problems trying to crawl and index websites that are built using Javascript.

Here’s a list of the most commonly faced problems with JavaScript website indexing.

Make sure to have a good look at it – you don’t want to repeat these same mistakes.

1. HTML delivered from the server is devoid of any content…
– This leads Googlebot to assume that there’s nothing to index.

2. Lazy-loaded images are only sometimes indexable
– Make sure they are properly indexed; use a noscript tag or structured data.
– Take caution: images referenced only through CSS are not indexed.
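As an illustration (the class name and file path are placeholders), a lazy-loaded image with a noscript fallback might look like:

```html
<!-- The lazy-loading script swaps data-src into src at scroll time -->
<img class="lazy" data-src="/images/cricket-bat.jpg" alt="Cricket bat">

<!-- The noscript fallback gives Googlebot a plain img tag to index -->
<noscript>
  <img src="/images/cricket-bat.jpg" alt="Cricket bat">
</noscript>
```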

3. Any content that is triggered via an interaction won’t be indexed
– Googlebot is not an interactive bot, which means it won’t go around clicking tabs on your website. Make sure the bot can get to all your content by either preloading it or using CSS to toggle visibility on and off.
– Better still, use separate URLs to navigate users and Googlebot to those pages individually.

4. Rendering timeouts
– Make sure your page is efficient and performant by limiting the number of embedded resources and avoiding artificial delays such as timed interstitials.

5. APIs that store local information are not supported
– What happens instead is that Googlebot crawls and renders your page in a stateless way.

Now, due to the increasingly widespread use of JavaScript, there is another step added between crawling and indexing. That is rendering.

Rendering is the construction of the HTML itself.

As mentioned before, the crawler needs to sift through your HTML in order to index your page. JavaScript-powered websites need to be rendered before they can be indexed.

According to Tom and John, Googlebot is already rendering your JavaScript websites.

What we can make out of the rendering process and indexing process for a JavaScript website is as below.

1. Googlebot uses the Chrome 41 browser for rendering
-Chrome 41 is from 2015 and any API added after Chrome 41 is not supported.

2. Rendering of JavaScript websites in Search is deferred
– Rendering web pages is a resource-heavy process, so rendering might be delayed for a few days until Google has free resources.

3. Two-phase indexing
– The first indexing happens before the rendering process is complete. After the final render arrives, there will be a second indexing.
– The second indexing doesn’t check for the canonical tag, so the initially rendered version needs to include the canonical link, or else Googlebot will miss it altogether.
– Due to the nature of two-phase indexing, the indexability, metadata, canonical tags and HTTP codes of your web pages could be affected.

John Mueller takes the baton and shares with us some basic information on rendering.

What’s important is that he shared with the crowd which method Google prefers, out of the four: client-side, server-side, hybrid, and dynamic rendering.

1. Client-side rendering
– This is the traditional setup, where rendering happens in the user’s browser or on the search engine’s side.

2. Server-side rendering
– Your server deals with the rendering and serves users and search engines alike static HTML.

3. Hybrid rendering (the long-term recommendation)
– Pre-rendered HTML is sent to users and search engines; then JavaScript is added on top of that for users. Search engines simply pick up the pre-rendered HTML content.

4. Dynamic rendering (the policy change & Google’s preferred way)
– This method sends client-side rendered content to users, while search engines get server-side rendered content.
– This works by having your site dynamically detect whether a request comes from a search engine crawler.
– Device-focused content needs to be served accordingly (the desktop version for the desktop crawler and the mobile version for the mobile crawler).

How hybrid rendering works.

Now that it is out in the open that Google prefers the (new) dynamic rendering method to help the crawling, rendering and indexing of your site, John gives a few suggestions on how to implement it.

Ways to implement dynamic rendering

1. Puppeteer
– A Node.js library, which uses a headless version of Google Chrome that allows you to render pages on your own server.

2. Rendertron
– Can be run as software or as a service that renders and caches your content on your side.

Both of these are open source projects where customization is abundant.

John also advises that rendering is resource-intensive, so do it out of band from your normal web server and implement caching where needed.

The most important key point of dynamic rendering is this: it has to be able to distinguish a search engine request from a normal user request.

But how could you recognize a Googlebot request?

The first way is to find Googlebot in the user-agent string.
The second way is to do a reverse DNS lookup.
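Those two checks can be sketched together in Python. Google’s documented approach is a reverse DNS lookup followed by a confirming forward lookup; the DNS calls here are injectable so the logic can be exercised without network access, and the sample hostname in the usage note is illustrative:

```python
import socket

def is_googlebot(ip, user_agent,
                 reverse_dns=socket.gethostbyaddr,
                 forward_dns=socket.gethostbyname):
    """Cheap check on the user-agent string first, then verify the IP
    with a reverse DNS lookup and a confirming forward lookup."""
    if "Googlebot" not in user_agent:
        return False
    try:
        hostname = reverse_dns(ip)[0]
    except OSError:
        return False
    # Genuine Googlebot hosts resolve under googlebot.com or google.com
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    # Confirm the hostname really resolves back to the same IP,
    # so a fake reverse DNS record can't spoof the check
    try:
        return forward_dns(hostname) == ip
    except OSError:
        return False
```

The user-agent check alone is trivially spoofable, which is why the round-trip DNS verification matters before you serve crawler-specific content.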

John stresses during the session that implementing the suggested rendering methods is not a requirement for indexing.

What it does is make the crawling and indexing process easier for Googlebot.

Considering the resources needed to run server-side rendering, you might want to weigh the cost before implementing it.

So when do you need to have dynamic rendering?

Here’s what,

When you have a large and constantly updated website, like a news portal, where you want to be indexed quickly and correctly.

Or, when you’re relying on a lot of modern JavaScript functionality that is not supported by Chrome 41, which means Googlebot won’t be able to render them correctly.

And finally, if your site relies on social media or chat applications that require access to your page’s content.

Now let’s look at when you don’t need to use dynamic rendering.

The answer is simple,

if Googlebot can index your pages correctly, you don’t need to implement anything.

So how can you know whether Googlebot is doing their job correctly?

You can employ a progressive checking.

Keep in mind that you don’t need to run tests on every single web page. Test perhaps two from each template, just to make sure they are working fine.

So here’s how to check whether your pages are indexed

1. Use Fetch as Google in Google Search Console after verifying ownership; this will show you the HTTP response as received by Googlebot, before any rendering.

2. Run a Google Mobile Friendly Test.

Why?

Because of the mobile-first indexing being rolled out by Google, mobile pages will be the primary focus of indexing. If the pages render well in the test, it means Googlebot can render your page for Search.

3. Keep an eye out for the new function in the mobile-friendly test. It shows you the Googlebot-rendered version and full information on loading issues in case the page doesn’t render properly.

4. You can always check the developer console when your page fails in a browser. There, you can access the console log from when Googlebot tries to render something, which allows you to check for a bunch of issues.

5. All the diagnostics can also be run in the rich results test for desktop version sites.

At the end of the session, John also mentions some changes that will happen.

The first happy news,

Google will be moving rendering closer to crawling and indexing.

We can safely assume this means that the second indexing will happen much quicker than before.

The second happy news,

Google will make Googlebot use a more modern version of Chrome, which means wider support for APIs.

They do make it clear that these changes will not happen until at least the end of the year.

To make things easier, here are the four steps to make sure your JavaScript-powered website is search friendly.

With that, the session is concluded. Do check out our slide show for a quick refresh.

All in all, Google is taking the mic and telling you exactly what they want.

Better take some notes.

Delivering search friendly java script-powered websites (Google io 18 summary) from Jia Thong Lo


The Hierarchy of Evidence for Digital Marketing Testing

Posted by on Jul 15, 2018 in SEO Articles | Comments Off on The Hierarchy of Evidence for Digital Marketing Testing

In the two-and-a-bit years that I’ve been working in digital marketing, I’ve been keen to understand the reasons why we make the decisions we do. There’s a wide variety of ways to approach a problem, and although many of them have value, there has to be a best way to make sure that the decision you make is the right one. I wanted to take my evidence-first way of thinking and apply it to this field.

In a previous life, I worked in clinical science, specifically in the field of medical physics. As part of this I was involved in planning and carrying out clinical trials, and came across the concept of a ‘hierarchy of evidence’. In clinical research, this refers to the different standards of evidence that can be used to support a claim – be that a new drug, piece of technology, surgical technique or any other intervention that is claimed to have a beneficial effect – and how they are ranked in terms of strength. There are many different formulations of this hierarchy, but a simple version can be seen here:

According to this ordering, a systematic review is the best type of evidence. This involves taking a look across all of the evidence provided in clinical trials, and negates the effects of cherry-picking – the practice of using only data that supports your claim, and ignoring negative or neutral results. With a systematic review we can be sure that all of the evidence available is being represented. A randomised controlled trial is a method of removing any extraneous factors from your test, and of making sure that the effect you’re measuring is only due to the intervention you’re making.

This is opposed to case-control reports, which involve looking at historical data for two populations (e.g. people who took one drug vs. another) and seeing what their outcomes were. This has its uses when it is not possible to carry out a proper trial, but it is vulnerable to correlations being misidentified as causation. For example, patients who were prescribed a certain course of treatment may happen to live in more affluent areas and therefore have hundreds of other factors causing them to have better outcomes (better education, nutrition, fewer other health problems etc.).

All of these types of tests should be viewed as more authoritative than the opinion of anyone, regardless of how experienced or qualified they are. Often bad practices and ideas are carried on without being re-examined for a long time, and the only way we can be sure that something works is to test it. I believe that this is also true in my new field.

A hierarchy of evidence for digital marketing

While working at Distilled, I’ve been thinking about how I can apply my evidence-focussed mindset to my new role in digital marketing. I came up with the idea for a hierarchy of evidence for digital marketing that could be applied across all areas. My version looks like this:

A few caveats before I start: this pyramid is by no means comprehensive – there are countless shades of grey between each level, and sometimes something that I’ve put near the bottom will be a better solution for your problem than something at the top.

I’ll start at the bottom and work my way up from worst to best standards of evidence.

Hunches

Obviously, the weakest form of evidence you can use to base any decision on is no evidence at all. That’s what a hunch is – a feeling that may or may not be based on past experience, or just what ‘feels right’. But in my opinion as a cold-hearted scientist, evidence nearly always trumps feelings. Especially when it comes to making good decisions.

Having said that, anyone can fall into the trap of trusting hunches even when better evidence is available.

Best practice

It’s easy to find advice on ‘best practice’ for any given intervention in digital marketing. A lot of it is brilliant (for example DistilledU), but that does not mean that it is enough. No matter how good best practice advice is, it will never compare to evidence tailored to your specific situation and application. Best practice is applicable to everything, but perfect for nothing.

Best practice is nevertheless a good option when you don’t have the time or resources to perform thorough tests yourself, and it plays a very important role when deciding what direction to push tests in.

Anecdotal evidence

A common mistake in all walks of life is thinking that just because something worked once before, it will work all of the time. This is generally not true – the most important thing is always data, not anecdotes. It’s especially important not to assume that a method that worked once will work again in this field, as we know things are always changing, and every case is wildly different.

As with the best practice advice above, anecdotal evidence can be useful when it informs the experimentation you do in the future, but it should not be relied on by itself.

Uncontrolled/badly controlled tests

You’ve decided what intervention you want to make, you’ve enacted it and you’ve measured the results. This sounds like exactly the sort of thing you should be doing, doesn’t it? But you’ve forgotten one key thing – controls! You need something to compare against, to make sure that the changes you’re seeing after your intervention are not due to random chance, or some other change outside of your control that you haven’t accounted for. This is where you need to remember that correlation is not causation!

Almost as bad as not controlling at all is designing your experiment badly, such that your control is meaningless. For example, a sporting goods ecommerce site may make a change to half the pages on its site, and measure the effect on transactions. If the change is made on the ‘cricket’ category just before the cricket season starts, and is compared against the ‘football’ category, you might see a boost in sales for ‘cricket’ which is irrelevant to the changes you made. This is why, when possible, the pages that are changed should be selected randomly, to minimise the effect of biases.
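The random selection described above can be sketched in a few lines. This is a hypothetical helper, not part of any tool mentioned here, and the page paths are made up:

```python
import random

def assign_buckets(pages, seed=2018):
    """Randomly split pages into control and variant buckets so that
    outside effects (like cricket season starting) hit both equally."""
    rng = random.Random(seed)  # a fixed seed makes the split reproducible
    shuffled = list(pages)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"variant": shuffled[:half], "control": shuffled[half:]}

pages = [f"/cricket/bat-{i}" for i in range(50)] + \
        [f"/football/boot-{i}" for i in range(50)]
buckets = assign_buckets(pages)
```

Because assignment is random rather than by category, seasonal demand for cricket gear lands in both buckets and cancels out of the comparison.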

Randomised controlled trials (A/B testing)

The gold standard for almost any field where it’s possible is a randomised controlled trial (RCT). This is true in medicine, and it’s definitely true in digital marketing as well, where they’re generally referred to as A/B tests. This does not mean that RCTs are without flaws, and it is important to set up your trial right to negate any biases that might creep in. It is also vital to understand the statistics involved here. My colleague Tom has written on this recently, and I highly recommend reading his blog post if you’re interested in the technical details.
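As a minimal sketch of the kind of significance check involved, here is a two-proportion z-test using only the standard library. The conversion counts are invented for illustration, and a real analysis (as Tom’s post discusses) involves more than a single p-value:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing conversion rates of variants A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# hypothetical: 200/10000 conversions on A vs 260/10000 on B
z, p = two_proportion_z_test(200, 10000, 260, 10000)
```

A small p-value suggests the uplift is unlikely to be random chance; a large one means the test hasn’t shown anything yet.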

A/B testing has been used extensively in CRO, paid media and email marketing for a long time, but it has the potential to be extremely valuable in almost any area you can think of. In the last couple of years, we’ve been putting this into practice with SEO, via our DistilledODN tool. It’s incredibly rewarding to walk the walk as well as talk the talk with respect to split testing, and to be able to prove that what we’re recommending is the right thing to do for a client.

Sign up to find out more about our new ODN platform, for a scientific approach to SEO.


The reality of testing

Even with a split test that has been set up perfectly, it is still possible to make mistakes. A test can only show you results for things you’re testing for: if you don’t come up with a good intervention to test, you won’t see incredible results. Also, it’s important not to read too much into your results. Once you’ve found something that works brilliantly in your test, don’t assume it will work forever, as things are bound to change soon. The only solution is to test as often as possible.

If you want to know more about the standards of evidence in clinical research and why they’re important, I highly recommend the book Bad Science by Ben Goldacre. If you have anything to add, please do weigh in below the line!

5 Google SERP Features That Can Give You Tons of Traffic

Posted by on Jul 15, 2018 in SEO Articles | Comments Off on 5 Google SERP Features That Can Give You Tons of Traffic

5 Google SERP Features That Can Give You Tons of Traffic

With how dynamic the SERP (Search Engine Results Page) has become, there’s actually much more to search visibility than being ranked first. There are now more ways to appear on the Google SERP than as a blue link, especially on mobile.

So today I’m sharing with you a list of SERP features that you should target. Five, plus one extra.

What are SERP features?

Other than the standard blue links, Google offers us users so much more.

Type square root of 48 into the search bar and you will get a calculator with the answer on it, which is pretty cool and convenient. When I asked Google how old Obama is, besides an instant answer, there was also a list of prominent figures related to Obama with their ages attached.

Google whips up a calculator to answer your math question.

What this means is, Google is getting better and better at giving users what they want. And they know a list of blue links on a white backdrop is not going to cut it any longer. That’s why they introduced, and constantly update, a bunch of SERP features.

Now, you might wonder why this concerns you. Because you can actually take advantage of some of those unique SERP features to give yourself more exposure, and therefore gain more traffic.

Well, I didn’t ask for Trump, Clinton or Michelle Obama’s age. But it’s nice to know anyway, thanks Google!

Fighting for a page one rank is getting much harder than before. That’s why we’re gonna fight smart.

Let’s get started with the 5 SERP features that can give you tons of traffic.
1. GOOGLE ADWORDS

This is a no-brainer. Wanna get listed on page one, even higher than the number one organic result? Well, Google’s got your back, provided you pay them some money (oops).

But if you think placing ads is as easy as paying and watching traffic trickle in, think again. Planning an AdWords campaign is actually an intense game of keyword research.

Just like when trying to rank a page, you need to identify the right keywords. Or even better, niche keywords that will get you dedicated and relevant visitors.

There are two types of keywords that you can target. Short tail keywords may have more search volume, but they are also highly competitive and expensive. Long tail keywords need more in-depth research and might not drive as much traffic, but what you get are dedicated visitors.

You might think: is it worth paying that much money for ads? Well, ads have been around for decades and they are not going away. I would recommend that a fresh business that needs some exposure use ads.

Content and inbound marketing are great. But when you can’t afford the “slow and steady win the race” method, placing some strategic ads is gonna help you boost the traffic, and build an audience.

2. FEATURED SNIPPET

Hey isn’t that OUR article?

Ever heard of the featured or SERP snippet? Wondering what a SERP snippet is? Well, ever searched for something and got a little box of information on top that answers your question right away? That lifesaver is the featured snippet, aka another spot on the SERP that you can grab.

The featured snippet is curated by the Google Knowledge Graph from Google’s library of indexed content. To answer a specific query, Google extracts and reformats relevant information from the single page that has the most fitting answer, then dumps (well, perhaps more gracefully than dumps) it into a box at the top of the SERP.

Also called position zero, the featured snippet is placed right at the top of the SERP. Yes, even before the number 1 ranked web page. According to HubSpot, the featured snippet has almost two times the click-through rate of the other links listed in the SERP. Which means a whole lot of traffic.

Click through rate for featured snippet is almost twice as much.

Now that we know how much traffic featured snippet can drive, how can we get that spot?

If you have an article in the top 5 spots on the SERP, you are already halfway there. Now all that’s left is to ask yourself: did your article answer the question in a straightforward and concise way?

Well, Google is smart, but that doesn’t mean they don’t need a little push. Editing your content a little to help it fit in the box will go a long way, both for your visibility and your traffic. Now go get that featured snippet spot.

3. GOOGLE IMAGE

Infographics and graphs tend to do well on image search.

A picture is worth a thousand words. There is a reason the Images tab sits right beside the All tab on the SERP. Image search is actually one of the regular SERP features that is largely ignored.

There are those who want to read a 5-thousand-word in-depth discussion of a single topic. There are also those who want their answer at a glance, laid out in a perfectly constructed chart, graph or table (you get the idea). And there are those who will only hire your service after they have had a look at your work.

Your image can also make it into the featured snippet box. Note that the snippet text itself may be extracted from a different website than the image.

So if you can sketch up a mean infographic or graph, with the perfect ratio of graphics to information, you might have just gotten the key to unlocking a bunch of traffic. Moreover, if you’re an interior designer, hotelier, jeweler, or even a pâtissier, where being able to show your work is important in securing clients, targeting Google image rankings will definitely help.

Now, how can you optimize for Google image search? If you are into SEO, you are probably aware of how important image ALT text is. If you want to rank in Google image search, here is a list of optimizations you need to do.

Relevant image file name. You don’t want it to be Q2RTZ4GP; let’s make it macaroon-pauls-patisserie instead.

Optimized ALT tag. ALT attributes are used as the text alternative to your picture. Think of it as a description of the image for search engines.

Informative image caption. Captioning an image and drawing relevance to the topic not only helps the visitor understand it better, it also means better optimization.

Insertion in a relevant and optimized page. Images from pages that rank high tend to rank higher in image search too.

Standard dimensions. Make sure your image dimensions are not out of the norm; stick to the good old 16:9 or 4:3.

No heavy image files, please. Keep them under 1MB, because who’s going to look at an image that takes more than a second to load?

Share it on multiple platforms. Be it Flickr, ImageShack, Twitter or Reddit, just spread it like butter.
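Put together, an optimized image embed might look like the sketch below. The file name, ALT text and caption follow the checklist above; the bakery details are illustrative:

```html
<figure>
  <img src="macaroon-pauls-patisserie.jpg"
       alt="Raspberry macaroons at Paul's Patisserie"
       width="1200" height="675"> <!-- 16:9, kept under 1MB -->
  <figcaption>Freshly baked raspberry macaroons at Paul's Patisserie.</figcaption>
</figure>
```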

4. LOCAL SEARCH PACK

Now we’re treading a bit into local SEO waters.

One of the most frequently used query formats on Google Search is “… near me”. The ellipsis could be a restaurant, a plumbing service, a hardware store. Basically, any local business in close proximity to the user.

Being listed in the local search pack is important for small local businesses because it means tons of exposure.

Google will basically try to list all local businesses, but you need to claim yours in order to optimize it and use it to your advantage. In the rare case where your business is not on Google’s radar, simply create the listing. Here is a guide on how to get started with Google My Business (GMB).

Setting up your GMB account doesn’t mean the job is done. The most important factor in this SERP feature is, of course, the proximity of your listing to the user. However, there is also an extensive list of information and settings that you can fill in to increase your ranking.

Moz compiled a list of the Top 30 Foundational Factors for ranking in the local search pack. If interested, you can read their extensive local search ranking factors.

5. PEOPLE ALSO ASK

Google tries to give the best answer to every query, and they try hard. People also ask is one of the SERP features where they try to give the user the most detailed and relevant answer, by suggesting similar questions that others have asked.

Every question bar you click on shows you the answer while simultaneously loading more question bars at the bottom that relate to the question you just clicked. You can go on and on, and basically get all your answers.

Google shows you the answer when you click on the respective question box.

Sometimes, the answer presented is the featured snippet for that exact query. Other times, it is not.

We are not sure how Google decides which page to feature and which not to. One thing that is definite, however, is that the page has to answer the question.

The featured snippet for the exact People also ask query in the actual SERP is not the same one displayed in the question box.

The answer featured in the People also ask box is actually extracted from the 10th link on the SERP.

Like all quality content, you need solid and concise writing that is relevant to your targeted keywords.

People also ask, like the featured snippet, is placed above the ten organic blue links. That means a whole lot of exposure, and also a whole lot of traffic that could be yours.

EXTRA: YOUTUBE VIDEOS

Entertainment queries tend to trigger more YouTube video results.

YouTube is a Google property, which means it makes perfect sense for YouTube videos to be featured on the SERP. Featuring videos in the SERP is one of the steps Google has taken to give more dynamic results. Some things are just better explained in motion and speech.

Now, there are some queries out there that can trigger a SERP with half of the results being a video from YouTube. There are also some queries that won’t show you a single video at all.

Informational videos, tutorials, how-tos, reviews and entertainment queries are the most likely to trigger a video result.

So how do you link YouTube videos back to your main website and gain traffic? Make good use of the description box, or mention your domain in the video. Videos can be a good way to build up your brand image and authority.

So while it might not have a direct impact on your website’s organic traffic, it is another way to gain a place on the Google SERP and spread your brand name.

5 Google SERP Features That Can Give You Tons of Traffic from Jia Thong Lo

Here’s a quick slideshow to help you review what you have just read. Now go get that traffic!



5 Reasons Your Page Is Not Indexed On Google Search and How to Fix It

Posted by on Jul 15, 2018 in SEO Articles | Comments Off on 5 Reasons Your Page Is Not Indexed On Google Search and How to Fix It

5 Reasons Your Page Is Not Indexed On Google Search and How to Fix It


After hours of coding, writing, designing and optimizing, finally, a new web page has gone live. But hey, why is it not appearing in Google Search? What have I done wrong? Why does Google hate me?

Now, now, we have all been there before. And I have learned to give it at least a day or two before trying to seek out my new blog post on Google Search. Because I have long accepted that that’s how long Google needs to actually put my newborn, I mean my new blog post on Google Search.

The process of getting a new web page or a new website on Google Search is a long and windy one. But it’s one worth learning about.

Let’s start with the basics.

What is Search? How does it work?

Here’s a video from Google, where Matt Cutts tells you a little about how search works.

Have you watched it? If yes, please bear with my little attempt at summarizing.

For content to appear on Google Search, it has to go through spiders. No, not real spiders, but a program called a spider. A spider starts with a link, then crawls through the content. If it sees another link in the content, it will crawl that too, and the process repeats.

Crawled content, or web pages, are then stored in Google’s index. When a user makes a query, answers are pulled from the index.

So in order for your content to show up on Google Search, you first have to make sure your website is crawlable (the Google crawler is called Googlebot). Then you have to make sure it’s indexed correctly by the indexer, which is called Caffeine. Only then will you see your content appearing on Google Search.

Here’s the thing: how do you check exactly whether or not Google has indexed your content? Well, you can’t. Like with all things in SEO, the next best thing you can do is analyze and give it your best guess.

Try typing site:insertyourdomainhere.com into the Google search bar and hit enter. Google Search will give you a list of all indexed web pages from your domain.

But, as Matt Cutts once said, web pages that are not crawled CAN appear on Google Search as well. Well, that’s another topic for another day.

Still interested? The video is only 4 minutes long. You can have a look if you want.

Anyway, let’s get back to the topic. For me, I give it at least a couple of days, at most a week, before I start freaking out about why my content is still not appearing on Google Search.

If it has been more than a week, or even a month, and your website is still not there, here is a list of things you need to consider.

1. Have you checked your robots?

Sometimes a little-overlooked detail can have a big effect.

Robots.txt is the first place Googlebot visits on a website, to learn which parts of the site it should and shouldn’t crawl.

Do you have this in your HTML head section?

The robots noindex tag is handy to make sure that a certain page will not be indexed, therefore not listed on Google Search.
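For reference, the tag being discussed sits in the page’s head section and looks like this:

```html
<!-- tells crawlers not to index this page (nofollow additionally
     tells them not to follow its links) -->
<meta name="robots" content="noindex, nofollow">
```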

Commonly used while a page is still under construction, the tag should be removed when the web page is ready to go live.

However, because of its page-specific nature, it comes as no surprise that the tag may be removed on one page but not another. With the tag still applied, your page will not be indexed, and therefore will not appear in the search results.

Similarly, an X-Robots-Tag HTTP header can be added to the HTTP response, which can then be used as a site-wide alternative to the page-specific robots meta tag.
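As a sketch, the header itself is the single line below. One common way to apply it, for example to every PDF on an Apache server, is shown underneath; the file pattern is just an illustration:

```
X-Robots-Tag: noindex

# e.g. in an Apache config, applied to all PDFs:
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```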

Again, with the tag applied, your page will not show up in Search. Make sure to fix them.

Read more about meta tags here: How To Control Web Crawlers With Robots.txt, Meta Robot Tags & SEOPressor

2. Are you pointing the Googlebot to a redirect chain?

Googlebot is generally a patient bot. It goes through every link it comes across and does its best to read the HTML, then passes it to Caffeine for indexing.

However, if you set up a long, winding redirect chain, or the page is simply unreachable, Googlebot will stop looking. It will literally stop crawling, sabotaging any chance of your page being indexed.

Not being indexed means not being listed on Google Search.

I’m perfectly aware that 30x redirects are useful and crucial to implement. However, when implemented incorrectly, they can ruin not only your SEO but also the user experience.

Another thing: don’t mix up 301 and 302. Is it moved permanently or moved temporarily? A confused Googlebot is not an efficient Googlebot.
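A rough way to reason about this is to walk a URL through your known redirect map and flag chains that loop or run too long. The URLs and hop limit below are illustrative, not from any real crawler:

```python
def follow_redirects(url, redirect_map, max_hops=5):
    """Walk a URL through a map of known redirects, flagging loops
    and chains long enough that a crawler may give up."""
    chain = [url]
    while chain[-1] in redirect_map:
        nxt = redirect_map[chain[-1]]
        if nxt in chain:            # revisiting a URL means a loop
            return chain + [nxt], "loop"
        chain.append(nxt)
        if len(chain) - 1 > max_hops:
            return chain, "too long"
    return chain, "ok"

redirects = {"/old": "/older", "/older": "/oldest", "/oldest": "/final"}
chain, status = follow_redirects("/old", redirects)
```

A three-hop chain like this still resolves, but every extra hop wastes crawl effort; collapsing all of them to point straight at `/final` is the fix.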

Hear it from Google themselves.

So make sure that all of your pages are healthy and reachable. Fix any inefficient redirect chains to make sure they are accessible by both crawlers and users alike.

3. Have you implemented the canonical link correctly?

A canonical tag is used in the HTML header to tell Googlebot which is the preferred and canonical page in the case of duplicated content.

For example, you have a page that is translated into German. In that case, you’d want to canonical the page back to your default English version.

As a best practice, every page should have a canonical tag: either linking back to itself where the content is unique, or linking to the preferred page where it is duplicated.
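For reference, the tag goes in the page’s head section; the URL here is a placeholder:

```html
<!-- on a unique page: a self-referencing canonical -->
<link rel="canonical" href="https://example.com/blog/my-post/">
<!-- on a duplicate, href would instead point to the preferred page -->
```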

Here comes the question: is the link you canonicalize to correct?

In the case of a canonical page and its duplicates, only the canonical page will appear on Google Search. Google uses the canonical tag as an output filter for search.

Meaning, the canonical version will be given priority in ranking.

SEOPressor Connect lets you skip the step of manually inputting the canonical tag.

If that is not your purpose, fix your canonical tag and link it back to itself. That will do the trick.

4. Maybe you have exceeded your Crawl budget

I’m on a budget.

Google has thousands of machines running spiders, but there are millions more websites out there waiting to be crawled.

Therefore, every spider arrives at your website with a budget: a limit on how many resources it can spend on you. This is the crawl budget.

Here’s the thing: as mentioned before, if your website has a lot of redirect chains, they will eat into your crawl budget unnecessarily. Because of that, your crawl budget might be spent before the crawler reaches your new page.

How do you know what your crawl budget is? In your Search Console account, there is a crawl section where you can check your crawl stats.

Let’s say your website has 500 pages, and Googlebot is only crawling 10 pages on your site per day. That crawl budget will not be enough for the new pages you’re pumping out.

In that case, there are a few ways to optimize your crawl budget.

First of all, authoritative sites tend to be given a bigger and more frequent crawl budget. So get those backlinks. More quality, relevant links pointing to your website signal that your website IS of good quality and highly relevant to your niche.

We all know building up authority doesn’t happen in one day. So another thing that you can do is to make sure that your site can be crawled efficiently.

You need to make good use of your robots.txt file. We all have pages on our websites that don’t really need to be in Search, like duplicate content, under-construction pages, dynamic URLs, etc.

You can specify which crawler an instruction applies to and which URL strings should not be crawled. As an example:
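A hypothetical robots.txt; the paths are placeholders:

```
User-agent: *
Disallow: /under-construction/
Disallow: /*?sessionid=

User-agent: Googlebot
Disallow: /print-versions/
```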

That way, crawlers won’t spend unnecessary budget on pages that don’t need crawling.

A list of the most common user agents includes:

Googlebot (Google)

Bingbot (Bing)

Slurp (Yahoo)

DuckDuckBot (DuckDuckGo)

Baiduspider (Baidu)

YandexBot (Yandex)

facebot (Facebook)

Ia_archiver (Alexa)

One important thing I have already mentioned above that also applies to optimizing your crawl budget is fixing those redirect chains. They are not only inefficient, they are also eating up your crawl budget.

If there are any pages returning 40x errors, fix those too.

5. Is your page actually an orphan?

An orphan page is a page that has no internal links pointing to it. Perhaps a link is faulty, leaving the page unreachable, or a link was accidentally removed during a website migration.

Remember how the spiders work? They start from one URL and from there they crawl to other URLs that are linked.

An orphan page can’t be crawled because there is no path for the crawler to reach it. It is not linked from your website, thus the term orphan. That’s why interlinking is so important: it acts as a bridge for the crawlers from one piece of your content to another.

Read more about interlinking here: Why Internal Links Matter To Your SEO Effort?

How can you identify orphan pages?

If you’re like us and you’re using the WordPress CMS, you can export a full list of URLs of every page and piece of content on your website. Use that to compare with the unique URLs found in a site crawl.

Or you can look at your server’s log files for the list of unique URLs loaded over, say, the last 3 months. Again, compare that with the list you got from the site crawl.

To make your life easier, you can load the data into an Excel file and compare the lists. The URLs that appear in only one list are the ones that are orphaned.
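That comparison is essentially a set difference, so you can also script it instead of using Excel. A sketch with made-up URLs:

```python
def find_orphans(cms_urls, crawled_urls, logged_urls=()):
    """URLs the CMS knows about that neither the site crawl nor the
    server logs ever reached are likely orphan pages."""
    reachable = set(crawled_urls) | set(logged_urls)
    return sorted(set(cms_urls) - reachable)

cms = ["/about/", "/blog/post-1/", "/blog/post-2/", "/old-promo/"]
crawled = ["/about/", "/blog/post-1/"]
logs = ["/blog/post-2/"]
orphans = find_orphans(cms, crawled, logs)  # → ["/old-promo/"]
```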

Once you know which pages are orphaned, fixing them is much easier.

Now what you need to do is link those orphan pages appropriately. Make it so they are easily discoverable by users and crawlers alike. Also, don’t forget to update your XML sitemap.

Spidey, please do your job instead of lying around getting tied to a railway.

If everything is working nicely, no error codes are returned and the robots tags are fine, but your page is still not showing up, why? Well, the issue might very well be on Google’s side. Maybe you are just not the crawler’s priority.

Google only has so many resources. And the bigger and more authoritative websites will be allocated a bigger and more frequent crawl budget.

Here are 5 steps you can take to urge Googlebot to crawl your new pages faster.
1. Update your sitemap in search console

Submitting your sitemap to Google is like telling them “Hey, here check out these important URLs from my website and crawl them!”.

They might not start crawling those URLs immediately but at least you gave them a heads-up.

If you run a huge site that updates constantly, keeping up with your robots directives and multiple sitemaps will probably drive you nuts.

Keep in mind, though, that you can’t mark a page noindex, nofollow in your robots directives and then add it to your sitemap. So do you want it indexed or not?

To avoid things like this, maintaining a dynamic ASPX sitemap will probably be your best choice.

2. Submit your URL directly to Google for indexing

Similarly, Google lets you manually submit your new page to them.

It’s really simple. Just search for submit URL to Google, and the search result will return an input bar. Now copy and paste the URL of your new page and click the submit button.

Voila, you’re done submitting a new URL to Google for crawling and indexing.

However, just like the sitemap, this only acts as a heads-up for Google to make them aware of the existence of your new page.

Just do it anyway when your web page has been sitting there for a month and still isn’t indexed. Doing something is better than nothing, right?

3. Use fetch as Google in Search Console

You can request, directly to Google, for a re-crawl and re-index of your page.

That can be done by logging into Search Console and performing a fetch request via Fetch as Google. After making sure that the fetched page appears correctly (all the pictures load, there are no broken scripts, etc.), you can request indexing, then choose between crawling only this single URL or this URL plus any directly linked pages.

Again, Google warned that the request will not be granted immediately. It can still take up to days or a week for the request to be completed.

But hey, taking the initiative is better than sitting and waiting, right?

4. Get high domain authority

Government websites are one example of high authority websites.

Once again, domain authority affects how big your crawl budget is and how frequently you are crawled.

If you want your new pages and website changes to be indexed swiftly, you have a better chance if your page rank is high enough.

This is a matter of slow and steady wins the race, though. If you can get a million backlinks from one single piece of content in a single day, that’s great.

But one great piece of content is not enough. Your website needs to be updated frequently and consistently with quality content while simultaneously gaining quality backlinks for your page authority to go up.

Start updating your website at least twice weekly, reach out to the community to build brand awareness and connections.

Keep that effort up, and slowly but steadily your authority will rise and your website will be crawled and indexed much faster.

5. Have a fast page load speed

Here’s the thing, when you have a website that loads fast, Googlebot can, therefore, crawl it faster.

In the unfortunate case where the load speed of your website is not satisfying and requests frequently time out, you’re really just wasting your crawl budget.

If the problem stems from your hosting service you should probably change to a better one. On the other hand, if the problem comes from your website structure itself, you might need to consider cleaning up some codes. Or better yet, make sure it is well optimized.

Read more about page speed and SEO here: The Connection Between Site Speed and SEO Today



What to consider when selecting marketing channels

Posted by on Jul 15, 2018 in SEO Articles | Comments Off on What to consider when selecting marketing channels

Selecting the marketing channels to use is a complex decision – there’s no one-size-fits-all formula for distributing your product. Whether you’re just getting started or you’ve been in business for a while, it can be difficult to find effective ways to reach your target market.

One way to make the process simpler is to break down all the factors that play into channel selection. Your customer base, your available resources, and your product itself can all help guide your decision. Take the following factors into consideration as you weigh your options, and you’ll have an easier time choosing the right marketing channels for your business.

What are the physical attributes of your product?

Sometimes the physical attributes of a product dictate how it should be distributed. Not everything can be easily shipped, and some items need to be handled more carefully than others.

If your product is large and heavy, for instance, shipping it across the country may not be practical. And if you sell something perishable – like food or cosmetics – you’ll probably want to get that product into customers’ hands as quickly as possible. In cases like these, it’s best to look for a short marketing channel.

However, if your product is durable and easy to ship, you have more options. In this case, a longer distribution channel with more middlemen may give you certain advantages, like a wider distribution area.

What kind of brand image do you want to create?

Your brand’s overall image is shaped by your customers’ buying experience from start to finish. Where and how a person buys your product is just as important as the quality of the product itself. As you consider your options for distribution channels, ask which ones support the kind of brand image you want to cultivate.

For instance, if you want people to associate your brand with uniqueness or exclusivity, you probably wouldn’t want to sell your products at Walmart, even if that meant reaching more customers. Rather, you’d probably want to target more exclusive retailers, or even focus on distributing your product online yourself.

How technical is your product?

The more specialized or difficult your product is to use, the more you’ll benefit from using short marketing channels or selling directly to customers. That’s because people are often reluctant to take a risk on an ‘intimidating’ product unless they’ve built up some trust with the business first. Leads must feel assured that you’ll help them with setup and provide tech support if something goes wrong. For example, if you sell specialized software or complex machinery, you’ll probably want to focus on choosing leads carefully and building relationships with them – not distributing your product as far and wide as possible.

Are you selling to individuals or businesses?

Business to business selling requires a different approach than business to consumer selling. If you’re selling to individuals, retail may be a good option for you, since most B2C businesses don’t need to build personal relationships with customers.

However, if you have a B2B business, retail is out of the picture. It’s too impersonal, and it won’t put your product in front of the right customers when they need it. Direct selling or selling through an agent will likely be your best bet.

How big and geographically diverse is your target market?

Where do you want to sell your product, and how many people do you expect to buy it? Look for marketing channels that can accommodate both the area you want to cover and the volume of customers you’re anticipating. For a small, local business, this could mean setting up your own store or selling your product door-to-door. If you want to reach a wider market, the internet is a good option that’s accessible even to small, new businesses.

Where and how does your target market like to shop?

Do some market research and figure out how your target audience prefers to shop. Do they visit retail stores? Do they place bulk orders online? Are they inclined to make impulse purchases, or do they research products carefully before making a decision? Knowing your market’s shopping habits will make it easier to position your product where buyers can find it.

How much time and effort can you spend on distribution?

Distributing a product takes a lot of resources and organization. Handing the product off to a middleman makes the process easier for many businesses. However, if you have the resources to do your own distribution through direct selling or an ecommerce site, you retain control over how your product gets into customers’ hands. You might also make more profits in the long run.

Which marketing channels do your competitors use?

It’s important to know how your competitors sell their products to customers. If you aren’t sure which channels your competitors are using, do a little research to find out.

You can use this information in a couple of ways. The first way is to adopt the same marketing channels your competitors use, or find very similar ones (and this includes social media). This strategy can work well because you know that your competitors’ channels have a built-in market for the types of products you sell.

Another approach is to avoid your competitors’ marketing channels entirely. Instead, look for different channels where your rivals have no reach, and sell there. This can be a very effective way to cut down on competition. However, it can be hard to find marketing channels that are both effective and untapped in your field. If you’re good at thinking outside the box and you have the resources to do plenty of promotion, this strategy might be worth a try.

Which channels offer you the most advantages?

Some marketing channels will offer you more advantageous partnerships than others. Make a list of the channels you’re considering, and ask yourself the following questions:

Will certain middlemen promote your product more than others?
How will your choice of middlemen affect your bottom line? How can you maximize your profits?
Do any channels have particularly favorable or unfavorable policies? For instance, if a potential partner wants the exclusive rights to distribute your product, that might not be a good deal for you.
What kind of reputation does each channel have? Is their business financially sound? Are they known for being reliable and pleasant to work with?

The take-away

There are a lot of moving parts to consider as you select marketing channels for your product. Deciding doesn’t have to be overwhelming, though. Weigh these nine important factors, both on their own and in relation to each other, as you consider your options. By putting plenty of thought and analysis into your decision, you’ll give yourself the best possible odds of selecting marketing channels that benefit your business for years to come.

Amanda DiSilvestro is a writer for No Risk SEO, an all-in-one reporting platform for agencies. You can connect with Amanda on Twitter and LinkedIn, or check out her content services at amandadisilvestro.com.

 

Why Internal Links Are as Powerful as External Links with Dawn Anderson

Posted by on Jul 15, 2018 in SEO Articles | Comments Off on Why Internal Links Are as Powerful as External Links with Dawn Anderson

Why Internal Links Are as Powerful as External Links with Dawn Anderson

Enjoy a brand new cognitiveSEO Talks – On Search and Traffic with Dawn Anderson, an all-in-one professional who will make you take out a pen and paper and note down everything she has to say.

 

There’s a lot to say about Dawn. She is a skillful digital marketer and SEO consultant with over 11 years of experience, and a lecturer at Manchester University. She is also a speaker and trainer who contributes to international conferences, and she founded Move It Marketing back in 2012.

 

 

 

As Dawn herself mentioned in the talk above, SEO is about constantly digging and looking for secrets. And speaking of secrets, let me tell you something about Dawn: although it seems she’s been doing SEO & digital marketing forever, she had a previous career in a totally different domain: she managed a building maintenance company. Maybe one of the greatest things about the search industry is that people come from very diverse backgrounds.

 

There are so many SEO elements you need to keep an eye on, so many ranking factors; we can never guess exactly what they are and this is why every year we end up with more Google ranking factors to consider.

Dawn Anderson

International SEO consultant & Digital Marketing Lecturer  @dawnieando / Move It

 

Aside from being a great SEO consultant, the director of Move It Marketing is also a big dog lover. And if you listen to the interview carefully, you’ll hear her cute Pomeranian joining the conversation.

 

We could list dozens of reasons why one should listen to this cognitiveSEO Talks episode with Dawn. Yet, we wouldn’t want to spoil the joy of discovering an engaging conversation sprinkled with insights and words of wisdom.

 

The key is always to try to make great sites and to develop your technical skills as much as you possibly can.

Dawn Anderson

International SEO consultant & Digital Marketing Lecturer  @dawnieando / Move It

 

 Tackled Topics: 

 

How Dawn got into SEO and search marketing
The importance of testing in SEO
Duplicate content issues for eCommerce sites
Crawl budget and how it impacts a site
Best use cases for PPC and SEO
The impact of voice search in SEO
The Google mobile speed update and its impact

 

  Top 10 Marketing Nuggets:  

 

Make sure you have some small test sites of your own even if you’re brand new in the industry. 2:06
The key is to try to make great sites and to develop your technical skills as much as you possibly can. 10:02
Internal links are for me often as powerful as external links. 11:35
Crawl budget on massive sites can impact rankings. 11:50
Technical issues are like a slow painful death by a thousand cuts on a big website. 14:26
[For e-commerce websites] Add photos, reviews, videos, linking content, xml sitemaps as it’s another way of enhancing your overall website with another layer of data Google does have access to. 23:34
We don’t know if links are being ignored by search engines. It’s a play in the dark. 29:45
Voice Search will massively affect SEO. Maybe not today but in the next couple of years, we’re going to see that. 31:01
No one person in this industry has all the answers. Be critical, read widely, test widely; make sure to try learning all the time. 36:12
If you have slow-loading sites you should be looking to address it; one of the biggest things you can do is to optimize images. 41:39

The post Why Internal Links Are as Powerful as External Links with Dawn Anderson appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

How to organize your keyword lists

Posted by on Jul 15, 2018 in SEO Articles | Comments Off on How to organize your keyword lists

How to organize your keyword lists

Keyword research is a fundamental tactic that I have seen completely transform the overall marketing strategies of those who take it seriously.

In fact, just about any marketing area begins with keyword research, be it competitive analysis, traffic growth, content planning, or PPC strategy. It has always been the foundation of online marketing and it still is – even though it’s rapidly evolving.

I have seen clients go from barely functioning marketing plans to full-scale content marketing projects that up their rankings and conversions. Keywords are serious weapons.

Why organize keywords?

Keyword lists are messy. They contain every little variation of each particular query because they include whatever enough people spontaneously type into the search box.

We search in a more disorganized way than we speak. For example, we could search ‘research keywords’, ‘how to research keywords’, ‘research keywords how to’, ‘keyword research tips’, or even ‘keyword research how to tips’ – and all of that will basically mean the same thing (i.e. we want to know how to research keywords).
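To make the idea concrete, here is a minimal sketch (in Python, with a made-up stopword list) of how re-ordered variants of the same query can be collapsed into one group by ignoring word order and filler words:

```python
from collections import defaultdict

def normalize(query: str) -> frozenset:
    """Reduce a query to its set of meaningful tokens, ignoring word order
    and common filler words, so re-orderings collapse to one key."""
    stopwords = {"how", "to", "a", "the", "for", "of"}  # illustrative only
    return frozenset(w for w in query.lower().split() if w not in stopwords)

def group_variants(queries):
    """Group queries that share the same token set."""
    groups = defaultdict(list)
    for q in queries:
        groups[normalize(q)].append(q)
    return list(groups.values())

queries = [
    "research keywords",
    "how to research keywords",
    "research keywords how to",
    "keyword research tips",
]
print(group_variants(queries))
```

Note that ‘keyword research tips’ still lands in its own group here, since ‘keyword’ and ‘keywords’ are different tokens; a real implementation would also stem or lemmatize words before comparing.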

Keyword research tools like SEMrush and Ahrefs will provide you with hundreds of thousands of those keyword strings (as well as marketing inspiration).

But how do you make sense of those lists, which leave you with a huge pile of keywords dumped with no rhyme or reason? How do you turn them into plans and actions?

This is precisely why you should be taking the time to organize your keywords. It might not be a very fun process, but it is a very important one.

Here are some tips.

Usefulness and value

One popular way to organize your keywords is by usefulness of the keyword. How you define that is up to you, but many marketers categorize it by price per click balanced with the projected click rate. They also look at how likely it is that the keyword would help them rank on the first page or (more recently) get a featured snippet:

Top: the absolute best and most expensive keywords that you might try and target in the future, depending on time and budget, as well as how useful the end result would be in light of those factors
Moderate: middle ground keywords that cost less than the top, but have the highest potential within that price level. This is where most of your research should lead you and the largest portion of your spreadsheet is going to be dedicated to these
Bottom: the cheapest keywords to target aren’t worth much when it comes to primary keyphrases. However, you may want to keep an eye on them anyway and sometimes look to them either for inspiration on future phrases (to expand on), or as secondary/tertiary phrases for projects that require them.
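The tiering above can be sketched as a simple function; the CPC and clicks-per-dollar thresholds below are invented for illustration, not benchmarks, and you would tune them to your own budget:

```python
def tier_keyword(cpc: float, projected_clicks: int) -> str:
    """Bucket a keyword by cost per click balanced against projected clicks.
    Thresholds are illustrative placeholders."""
    value = projected_clicks / cpc if cpc > 0 else float("inf")
    if cpc > 5.00:
        return "top"        # expensive; target later, budget permitting
    if value >= 100:
        return "moderate"   # best potential at a reasonable price
    return "bottom"         # watch list / secondary phrases

keywords = [
    ("project management software", 7.50, 900),
    ("organize keyword lists", 1.20, 400),
    ("keyword spreadsheet ideas", 0.80, 30),
]
for phrase, cpc, clicks in keywords:
    print(phrase, "->", tier_keyword(cpc, clicks))
```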

Featured tool: I like using Ahrefs “clicks” data to determine most useful phrases, i.e. phrases that are able to send a lot of traffic, and those where my site ranks pretty well already:

[No other keyword research tool beats this insight.]

Relevancy

Many of the keyword strings in your lists mean pretty much the same thing. They get in your way, preventing you from focusing on other important aspects of keyword research, so getting rid of those (or rather grouping them) is the first thing to do.

This is where keyword clustering comes in handy. I have already explained the tactic in detail here.

Featured tool: Serpstat looks at Google SERPs for each phrase and determines related queries by overlapping URLs. This is pretty much the only tool that can do that, to the best of my knowledge:
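The URL-overlap approach can be sketched roughly like this; the SERP data and the 0.4 similarity threshold are hypothetical, and a real tool would compare against whole clusters rather than a single seed query:

```python
def serp_overlap(urls_a, urls_b) -> float:
    """Jaccard similarity of two sets of ranking URLs."""
    a, b = set(urls_a), set(urls_b)
    return len(a & b) / len(a | b)

def cluster_by_serp(serps, threshold=0.4):
    """Greedy clustering: a query joins the first cluster whose seed
    query shares enough SERP URLs with it."""
    clusters = []  # each cluster: (seed_query, [member queries])
    for query, urls in serps.items():
        for seed, members in clusters:
            if serp_overlap(serps[seed], urls) >= threshold:
                members.append(query)
                break
        else:
            clusters.append((query, [query]))
    return [members for _, members in clusters]

# Hypothetical top-result URLs for each query
serps = {
    "research keywords": ["a.com", "b.com", "c.com"],
    "keyword research tips": ["a.com", "b.com", "d.com"],
    "buy running shoes": ["x.com", "y.com", "z.com"],
}
print(cluster_by_serp(serps))
```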

Search intent

Another way to organize keywords is by intent, which is usually more straightforward. Set some goals about what you want to accomplish – not just with keyword research but your whole brand. Use that to inform your keyword strategy and separate each goal by intent so you have a list of keywords for each.

Say that you want to target the market for affordable time management programs. You will want to increase brand visibility, get a featured snippet in a popular query and bring more attention to your social media. Make keyword lists for those three goals.

Usually search intent puts keywords into four groups:

Commercial (‘high’) intent: these users are ready to buy now
Informational intent: these users are willing to read, not ready to buy but may opt-in and stick around for a bit longer
Transactional intent: these users can be both (researching, then buying)
Navigational intent: these users are interested in a specific brand. Depending on whether that’s your brand or someone else’s, you may want to turn them into believers or snatch them for the competitor.
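A first pass at sorting a list into these four buckets can be done with simple trigger-word rules; the word lists here are illustrative, and real intent classification needs far richer signals (see the SERP check below):

```python
def classify_intent(query: str) -> str:
    """Rough rule-based intent bucketing using trigger words."""
    words = set(query.lower().split())
    if words & {"buy", "price", "discount", "coupon"}:
        return "commercial"       # ready to buy now
    if words & {"how", "what", "why", "guide", "tips"}:
        return "informational"    # willing to read, not ready to buy
    if words & {"review", "best", "vs", "compare"}:
        return "transactional"    # researching, then buying
    return "navigational"         # often a brand or site name

for q in ("buy time tracker", "how to manage a team",
          "best time management apps", "trello login"):
    print(q, "->", classify_intent(q))
```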

Featured tool: searching Google itself will give you some idea on what Google has found the intent to be. For example, if you see shopping results, you can be fairly sure Google has come to the conclusion that most of these searchers wanted to buy things.

[Chart source: Digitaleagles.]

Brand-focused queries

These should be a separate tab in your spreadsheet. Every company needs to make it easy for people to find them. Build this list from your brand name, [competitor alternatives], and so on; it’s an easy way to make sure your bases are covered and a simple way to organize your research.

Another way to do this is to target phrases that are negative and then prove them wrong with content. An example would be a phrase like, “Is [product name] a scam?” When users search it, they will find that no, you are not a scam and are not listed on any scam sites. This reassures them, even though the original search was negative.

Don’t forget to research all kinds of queries your (or your competitors’) brand includes:

[You may also want to label these queries by sentiment to give your content team more clues on how to address each one.]

By modifier

I always do these in their own list. A modifying keyword is one that uses an adjective to describe what is being searched for. For instance, they may search for ‘cheap project management platform’ or ‘free ways to manage teams’.

Words like free, cheap, top, best, etc., are fantastic modifiers and are easy to organize in their own section. Once you have had some trial and error you will know which work best.

Organizing by modifiers helps you evaluate trends in your niche and match them to your content and conversion funnel strategy. Do your potential customers tend to search for cheap or exclusive options? Are they looking for DIY or pre-built solutions? Organizing by modifiers gives you all those important answers.
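A quick way to read those niche trends is to count which modifiers actually appear across your keyword list, as in this small sketch (the modifier set is an assumption you would extend for your own niche):

```python
from collections import Counter

MODIFIERS = {"free", "cheap", "top", "best", "easy", "diy"}

def modifier_trends(queries):
    """Count which modifiers your audience uses, as a quick read on whether
    the niche skews toward 'cheap', 'best', 'DIY', and so on."""
    counts = Counter()
    for q in queries:
        counts.update(w for w in q.lower().split() if w in MODIFIERS)
    return counts

queries = [
    "cheap project management platform",
    "free ways to manage teams",
    "best free team software",
]
print(modifier_trends(queries))
```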

I wrote about this type of keyword organizing in an older article at Moz:

Use a template that includes all relevant information

Finally, make sure you are using as much information as possible. Add volume/clicks, difficulty and anything else you can think to use. You may also consider adding labels for which type of action each keyword requires:

Optimize old content
Create a new page

As well as page type it’s good for, e.g.:

Product page
Product list
Blog post
Video, etc.
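Pulled together, a labeled sheet like the one described above can be generated as a plain CSV; this is a minimal sketch, and the volumes, difficulty scores, and labels are all invented:

```python
import csv
import io

FIELDS = ["keyword", "volume", "difficulty", "action", "page_type"]

# Invented example rows; in practice these come from your research tools
rows = [
    {"keyword": "keyword research tips", "volume": 1900,
     "difficulty": 42, "action": "optimize old content", "page_type": "blog post"},
    {"keyword": "project management platform", "volume": 8100,
     "difficulty": 71, "action": "create new page", "page_type": "product page"},
]

buf = io.StringIO()  # swap in open("keywords.csv", "w", newline="") to save a file
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```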

There may be more labels if you are optimizing a local business website. Michael Gray described some in his article here.

That information should also include how it is working over time. I have made graphs with Excel using the data and gotten a much clearer picture of what is and isn’t bringing in the results I want. You can tweak from there.

Do you have a tip for organizing keywords? Let us know in the comments.