Google Still Has A Lot of Work To Do When It Comes To Location…

Posted by on Jul 18, 2018 in SEO Articles | Comments Off on Google Still Has A Lot of Work To Do When It Comes To Location…


One of the perks of international travel is getting to see what the SERPs really look like in other countries. Check out this query for “Sams Club”* I did while in Barcelona yesterday:

So Google thinks the “Barcelona” line of Sam’s Club furniture is relevant to me simply because I am in Barcelona. Note that my query never specified a location. Google just knows there’s a page called “Barcelona” that matches the name of the city I am in. If I had been in Helsinki and Sam’s had a line of much-needed Helsinki Bullshit Deflectors, it probably would have shown those.

The challenge is that Google often is not sure whether a search query has local intent (we have talked a lot about this in our presentations on our Local SEO Ranking Factors Study). It often cannot tell if the searcher wants a document (aka “a web page”) relevant to a location, or one relevant to a word/phrase, entity, or whatever. For another good example, see this post on Near Me SEO.

From the above result you can see that Google thinks there could be some local intent to the search (likely because Sam’s has retail locations) so it is showing me the Barcelona pages in the sitelinks. Had it been 100% confident, it likely would have shown the Barcelona URL as the top result (like a store page).

These results are not catastrophes but they could cause some confusion and possible abandonment for less-savvy clickers. They are not what I would call “good for users.” They do illustrate how tricky location can be for an algorithm.

My advice to retailers, and to any other sites with issues like these, is to make it super clear which pages contain relevant location information by using structured markup. You could even link from the Barcelona product pages to the Barcelona store page with the anchor text “Barcelona”. If Sam’s actually had a Barcelona location, that would make it more likely to appear in these results above the product pages. In extreme cases, you may also consider either noindexing these URLs or using the Google Search Console Remove URL tool to get rid of these unwanted results.
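To make the structured-markup advice concrete, here is a minimal sketch of the JSON-LD a store location page could embed so its location is unambiguous to crawlers. The store name, address, and URL structure below are placeholders for illustration, not Sam’s Club’s real data.

```python
import json

# Placeholder store details for a hypothetical Barcelona location.
store = {
    "@context": "https://schema.org",
    "@type": "Store",
    "name": "Sam's Club (example location)",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Barcelona",
        "addressCountry": "ES",
    },
}

# Emit the JSON-LD block that would go inside the page's <head>.
json_ld = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(store, indent=2)
print(json_ld)
```

With markup like this on a genuine store page (and none on the “Barcelona” product line pages), the algorithm has a much clearer signal about which URL answers a locally-intended query.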

This is an edge-case to be sure, but when you are operating at international brand scale, a few thousand edge cases can add up.

*A much-beloved LSG client

The post Google Still Has A Lot of Work To Do When It Comes To Location… appeared first on Local SEO Guide.

Early Results from Split Testing JavaScript for SEO

Posted by on Jul 17, 2018 in SEO Articles | Comments Off on Early Results from Split Testing JavaScript for SEO


We’ve been testing what happens when pages rely on JavaScript to render properly – and one of our first tests showed an uplift when we removed a reliance on JS:

When @distilled ran an SEO split test to remove a reliance on JavaScript, they saw an uplift in search performance

— Will Critchlow (@willcritchlow) May 25, 2017

As many of you know, at Distilled we believe that it’s increasingly important to be testing your hypotheses about what will affect your search performance. As digital channels mature, and as Google rolls more and more ML into the algorithm, it’s increasingly hard to rely on best practices. To make this easier, we have been rolling out our SEO split testing platform, Distilled ODN (Optimization Delivery Network) to more and more of our clients and customers.

As we get our platform deployed on a wider range of sites with different architectures and technologies, we’re able to start testing more and more of the assumptions and best practices held around the industry.

You can check out a bunch of case studies that we have already published on our site: structured data, internal linking and meta description, title and header tags – and you can find more details in this presentation (particularly slide 73 onwards) that my colleague Dom gave at a recent conference. We also included some teaser information in this post about a big win that added £100k / month in revenue for one customer even while only deployed on the variant pages (half of all pages).

One thing that we were excited to get to test was the impact of JavaScript. Google has been talking about rendering JavaScript and indexing the resulting DOM for some time, and others around the industry have been testing various aspects of it, figuring out when it times out, and finding out the differences between inline, external, and bundled JS.



The hypothesis: there is a downside to relying on JS indexation

I wrote a Moz post on how I believe that JavaScript rendering and indexation works at Google, but the very short version is that I think it happens in a separate process / queue to both crawling and regular indexing. I think there is a delay downside, and possibly even more than that.

We recently had a chance to observe some of the effects of JS in the wild. One of our consulting clients – iCanvas – was relying on JavaScript to display some of the content and links on their category pages (like this one). Most of our customers on the ODN platform are not consulting clients of Distilled, but iCanvas is a consulting client with the ODN deployed (I’ve written before about how the ability to split-test is changing SEO consulting).

With JavaScript disabled, there were a load of products that were not visible, and the links to the individual product pages were missing (it’s worth noting that the pages showed up correctly in fetch and render in Google Search Console). We wanted to make the consulting recommendation that performance may be improved by showing this information without relying on JavaScript – but this is the classic kind of recommendation that is hard to make without solid evidence. There is clearly a cost to making this change, and it’s hard to know how much of a benefit there is.

Before our change, the pages looked like this with JS disabled:

After the change, they looked more like this (which is exactly how they used to look with JS enabled):

[It’s worth noting that although the change we made here was technically a CSS change, the test is measuring the effect of removing JavaScript dependence – we just moved a feature from JS-reliant to non-JS-reliant]
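The before/after difference can be approximated with a quick script that scans the raw (pre-JavaScript) HTML for product links, which is roughly what a non-rendering crawler sees. The “/product/” URL convention and the snippets below are made up for illustration; they are not iCanvas’s actual markup.

```python
from html.parser import HTMLParser

class ProductLinkCounter(HTMLParser):
    """Counts <a> tags whose href looks like a product URL in raw HTML.
    The "/product/" path is an assumed convention, not a real site's."""
    def __init__(self):
        super().__init__()
        self.product_links = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and "/product/" in (dict(attrs).get("href") or ""):
            self.product_links += 1

# Raw server HTML before the change: products injected client-side, no links.
before_html = '<div id="grid"><!-- products rendered by JS --></div>'
# After the change: product links present in the initial HTML payload.
after_html = ('<div id="grid">'
              '<a href="/product/1">Canvas A</a>'
              '<a href="/product/2">Canvas B</a></div>')

counts = {}
for label, html in [("before", before_html), ("after", after_html)]:
    counter = ProductLinkCounter()
    counter.feed(html)
    counts[label] = counter.product_links

print(counts)
```

A check like this, run against `curl` output rather than a rendered DOM, is a cheap way to spot pages whose key content and internal links only exist after JavaScript executes.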

Split-testing the effect of JavaScript

Using our split-testing platform, we rolled out a change to 50% of the category pages to change them so that the information users might be looking for was visible on page load even with JavaScript disabled. The other 50% of pages remained unchanged and continued to rely on JavaScript.

We then automatically compared the performance of the updated pages with a forecast of what would have happened if we had not rolled out the change (this is called a “counterfactual”) [more here]. This showed that the pages we had updated to remove the reliance on JavaScript were getting a statistically significant amount of additional traffic relative to what we would have expected if the change had no effect:

The platform’s analysis showed a greater than 6% uplift in organic search performance for this set of pages, which amounted to over 3,000 additional sessions per month. This was an amazing win for such a small change (the chart above comes from the dashboard built into our ODN platform).

As an aside, the mathematicians on our team are constantly working on refinements to the way we detect uplifts with statistical confidence (see Google’s paper Inferring causal impact using Bayesian structural time-series models for more background). We use a variety of synthetic tests, null tests and cross-checked data sources to make improvements to the accuracy and sensitivity of the automated analysis. We also apply a variety of treatments to the analytics data to account for various behaviours (dominant pages, sparse traffic distribution, seasonal products etc.), as well as some modifications to how Google’s Causal Impact methodology is employed.

In the test above we have since improved the accuracy of the analysis (it did even better than the initial analysis suggested!), which is exciting. It also means we are capable of detecting tests that result in smaller uplifts than previously possible, helping lead to improved performance and improved attribution.
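As a toy illustration of the counterfactual idea, the sketch below uses the control pages’ traffic to forecast what the variant pages would have done without the change, then computes the uplift. All numbers are synthetic, and the single pre-period ratio is a gross simplification; the Bayesian structural time-series approach mentioned above is far more sophisticated.

```python
# Daily organic sessions (entirely synthetic numbers).
pre_control  = [1000, 1020, 980, 1010]   # control pages, before the change
pre_variant  = [ 500,  510, 490,  505]   # variant pages, before the change
post_control = [1005,  995, 1015, 1000]  # control pages, after the change
post_variant = [ 545,  540,  550,  548]  # variant pages, after the change

# Pre-period relationship between variant and control traffic.
ratio = sum(pre_variant) / sum(pre_control)

# Counterfactual: expected variant traffic had nothing changed.
expected = sum(post_control) * ratio
actual = sum(post_variant)
uplift = (actual - expected) / expected
print(f"estimated uplift: {uplift:.1%}")
```

The real analysis has to handle seasonality, dominant pages, and sparse traffic, which is exactly why the simple ratio above would not be trusted in production.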

It’s possible that even when rendered, JavaScript hinders search performance

The main takeaway is that you should avoid the assumption that JavaScript-driven pages will perform well in search even if Google is able to render them. We need to continue running more JS tests, but in the meantime, we strongly recommend testing whether your reliance on JavaScript is hurting your site’s organic search performance.

Does Syndicated Content Work? Will It Help or Will It Hurt Your SEO?

Posted by on Jul 17, 2018 in SEO Articles | Comments Off on Does Syndicated Content Work? Will It Help or Will It Hurt Your SEO?


Content syndication can be a great way to drive traffic to your site, get exposure, and maybe even improve your rankings. But, on the other hand, it can also turn out to be devastating if played wrong.


The main concern when it comes to syndication is duplicate content. Will Google penalize you if you take other people’s content and place it on your own website? Will it hurt your SEO? Then, there’s also a concern regarding rankings. If you have a small website, will you be able to rank above the authoritative sources you’ve posted your content on?



What Is Content Syndication?

Syndication vs. Guest Posting

Does Content Syndication Affect SEO & Traffic?

Duplicate Content, Backlinks & Penalties
When You Syndicate Other Websites’ Content
When Other Websites Syndicate Your Content
Press Releases

Should Syndicated Content Be Indexed or Not? Will It Outrank the Original Content?
Dealing With Stolen Content
An Alternative to Content Syndication


Well, there are many variables and, depending on each case, you should take proper action. In this article, we’ll try to cover as much as possible and help you decide whether content syndication will work for you or if it’s better to stay away from it.

What Is Content Syndication?


Content syndication is the process of giving other publishers the right to republish your content. It can happen in all types of media and isn’t restricted only to the web. Content syndication was popular in TV, radio, and print as well.


So, in a nutshell, content syndication is when you publish your content on other websites. Why would you do that? Well, there could be a number of reasons, but the biggest one is gaining exposure and, potentially, traffic.


You can also publish other people’s content on your own website, in order to attract visitors and make revenue through ads. It goes both ways.

Syndication vs. Guest Posting


If you thought this kind of sounds like guest posting, you’re right. There are some similarities. However, there are also differences that set the two pretty far apart.


Both guest posts and syndicated content have roughly the same purpose: to bring traffic and backlinks to your website. In that way, they’re similar. But, for one, guest posts are unique content (or at least they should be) while syndicated content is duplicate content. You: Wait, what? Duplicate content? Isn’t that, like, bad? It depends. But we’ll talk about this soon enough, so keep reading.


Content syndication is also a lot more scalable than guest posting. After all, there are only so many guest posts you can write in a month. With syndication, you also get to post the content on your website and own it. With guest posting, you don’t get that luxury. So, if a guest post is really successful, other websites will benefit from the traffic.


Content syndication is easier to scale than guest posting and you also have the advantage of owning the content.


However, it might be a lot easier to land a guest post somewhere than to syndicate content, because everybody loves unique content.

Does Content Syndication Affect SEO & Traffic?


When it comes to publishing your content somewhere else or publishing other content on your website, there are a lot of concerns that people have. How will this impact their rankings, traffic and image? Will it help you rank higher and get more traffic, or will it affect your rankings negatively?

Duplicate Content, Backlinks & Penalties


The biggest concerns people have when syndicating content are duplicate content and backlink penalties. Let’s go through both of them to find out more.


Duplicate Content Penalty:


It’s true, syndicated content is considered duplicate content. However, the duplicate content penalty is just a myth. Google doesn’t penalize websites for duplicate content. At least, not the way you think.



So, there you have it. Google doesn’t have a duplicate content penalty. It does however, penalize websites that scrape content or spam the web using duplicate content. If they provide no value at all, then Google might take manual action. But that’s not really about duplicate content as much as it is about spam.



However, Google likes it when you provide unique content and specify your duplicates. If there are 5 URLs with the same content, which one should Google rank? If you don’t help the bot, it will decide on its own at some point and send the other pages into the omitted results. You’re better off with unique content.


Google simply knows that users like diversity. They don’t like to see the same content ranking over and over again. So if you want to call Google’s omitted results a penalty, fine. However, a general website penalty for duplicate content is just a myth.


Link Penalties: Dofollow or Nofollow?


What Google always advises is to avoid any sort of scalable link scheme. Content syndication kind of fits that description, so you should definitely be careful where you syndicate your content. Make sure the websites you post on and get links from are decent.


If you set up partnerships with multiple publishers, then it’s probably a good idea to mark your backlinks with a nofollow tag. That way you’ll be sure that nothing bad can happen, especially if you’re paying for the post. And, just in case you’re wondering, nofollow links are useful for SEO.


Using nofollow links back to your site will ensure that you won’t get penalized. Nonetheless, if the offer isn’t incentivized, dofollow links are good.


However, if some webmaster reaches out to ask you for permission to republish your content because they think it’s awesome, feel free to get a dofollow link from them. As John Mueller mentioned above, as long as you’re not spammy, you shouldn’t worry.
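If you publish a partner’s syndicated post in bulk, the nofollow marking can be automated. The sketch below rewrites anchor tags pointing at given partner domains; it’s a simplistic regex illustration with a made-up partner domain, and production code should use a proper HTML parser and handle pre-existing rel attributes.

```python
import re

def nofollow_links(html, domains):
    """Adds rel="nofollow" to <a> tags whose href points at any of the
    given domains. Regex-based sketch for illustration only."""
    def _rewrite(match):
        tag, href = match.group(0), match.group(1)
        if any(d in href for d in domains) and 'rel=' not in tag:
            return tag[:-1] + ' rel="nofollow">'
        return tag
    return re.sub(r'<a\s+[^>]*href="([^"]+)"[^>]*>', _rewrite, html)

# Hypothetical syndicated snippet linking back to a partner site.
snippet = '<p><a href="https://partner.example/post">Syndicated partner</a></p>'
print(nofollow_links(snippet, ["partner.example"]))
```

Links to domains outside the partner list pass through unchanged, so editorial links you genuinely vouch for keep their dofollow status.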

When You Syndicate Other Websites’ Content


If you’re planning on publishing other people’s content on your website, then there are some things you need to consider.


First of all, you probably won’t be ranking that content in Google. You therefore need another source of traffic for it, otherwise it won’t be of much help. If you have a social media following or generate traffic some other way, then you shouldn’t have an issue. Second, the owners of the content might have different requirements, such as highlighting their website at the top of the post, requesting a backlink, or even a canonical tag.


Some websites follow this business model. They just republish great content from around the web and drive traffic to their site with it. However, if you think that’s easy-peasy to do, also consider that it takes years and a lot of hard work to build an audience and generate traffic to the website without spending more than earning.




Pros:
You don’t have to write content
You get to publish great content from multiple sources




Cons:
Owners might ask for canonicals, so you won’t drive traffic from Google
Can get into copyright issues if you don’t ask for permission
Google might think you’re spamming or using an autoblog plugin if you overdo it


How to do it the right way:


If you’re planning to syndicate content on your site… Wait, let me rephrase that. If you want to syndicate content on your site, you need a plan. You’ll need to know your sources, make sure they agree with you republishing their content, and also make sure you know where you’ll generate your traffic from. Scaling this with autoblogs is a bad idea, as it can get you into legal issues as well as get your domain penalized for web spam.

When Other Websites Syndicate Your Content


Getting your content on other websites is great. You can drive traffic back to your website and establish authority. However, it’s easier said than done. In order to get featured on worthy websites, you need worthy content. You should be posting quality content anyway, so this shouldn’t be an issue. Keep in mind that when you give away your content, if someone (hypothetically) shares it somewhere you haven’t thought of and gains 10,000 visitors, you won’t get any of that traffic. They will build e-mail lists and make ad revenue on your content.


Make sure that your content is indexed as soon as you post it. You can do this via the Google Search Console, in the Crawl > Fetch & Render section. Once Google fetches and renders the content, you can request indexing for that link. Do this for every post before you start distributing it. If it gets indexed first on another website, you might get into trouble and not be able to outrank it.


You might also notice that people won’t just simply share your content when you pitch them. They will want you to share theirs as well. I post your content, you post mine, right? Well, if you successfully pitch this deal to 10 blogs, they only have to post once, but you’ll have to post 10 different articles on your site. You’ll end up with one popular and original piece of content on your site, and 10 copied, duplicate posts.


Also, Google values one way links more than link exchanges. If you scale link exchanges it might even consider it some sort of link scheme.




Pros:
Nice way of promoting content and possibly even driving some traffic
Great way of building authority
Can get you quality backlinks




Cons:
People might want to exchange favors or money for it
You don’t make revenue from ads and can’t build e-mail lists
There’s a risk you’ll get outranked


How to do it the right way:


Once you have your great piece of content, first make sure it’s indexed before you pitch. You want to be the first, so that Google doesn’t think you’re the one copying content. Then you need to pitch it to quality websites in your niche. Make sure you’re able to get a rel=”canonical” back to your original post. If not, at least get a backlink, be it follow or nofollow.


A good place to start is Medium. You can easily republish your content there, because Medium offers you the option to add a canonical URL. This way, if the Medium post gets to the top of Google on the strength of its authority, your website will show instead.
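When you do strike a canonical deal with a partner, it’s worth verifying they actually kept the tag. The sketch below pulls the rel="canonical" URL out of an HTML document; the page markup is a made-up example of a republished post, not real Medium output.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extracts the rel="canonical" href from an HTML document, so you
    can confirm a syndication partner kept your canonical in place."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

# Hypothetical markup from a republished post on a partner site.
page = ('<html><head>'
        '<link rel="canonical" href="https://example.com/original-post">'
        '</head><body>...</body></html>')

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)
```

Running a check like this across every republished URL turns “I hope they kept the canonical” into something you can actually monitor.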

Press Releases


Press releases are also a form of syndicated content, but they act a little differently. First of all, they’re often paid, at least on the web.


Another difference is that you don’t post them on your own site. This means you won’t be facing the duplicate content issue. However, some news publishers may request original content for their website, meaning the press release will be more like a guest post than syndicated content. Some publishers will write their own content, which is great, but others will ask you to do it.



Since you’re scaling this, you should definitely use nofollow links, as Google recommends. The posts are about you, anyway, so you’ll probably get more traffic than with regular syndicated content, as you can pitch your product/service or website right at the beginning of the post.


However, a canonical tag doesn’t make sense here, since the content isn’t duplicate. The link will help, but you could still get outranked if your press release gets posted on a high authority website. To avoid this, simply optimize the press release for a slightly different keyword than the main ones you want to target with your website.

Should Syndicated Content Be Indexed or Not? Will It Outrank the Original Content?


Some time ago, I asked John Mueller for advice, as someone took my content and posted it on their website before I even had a chance to get it indexed.


We just rank the pages, we don’t decide who owns the content. If it’s a legal issue, you might want to get legal advice instead..

— John ☆.o(≧▽≦)o.☆ (@JohnMu) September 22, 2017


John’s answer was kind of disappointing back then. Unless the webmasters agreed to link to me or add a canonical link, there was pretty much nothing to do except file a copyright complaint.


However, recently, I’ve also found this piece of information:



So… Apparently Google does care about which content was indexed first. Theoretically, as long as your site has been indexed, it should be ranking first. So, right after you post, just use the search console to index it quickly and you’ll be fine.


Well… not so fast. This might work if you’re already a somewhat established website, but if your site is 5 days old and you get to republish your article on CNN, don’t expect to rank above it.


We’ve had some scraper site outrank cognitiveSEO in Google Image Search with our own featured image. How did we find out? Because Google picked it for an answer box. We took a look at the URL and it wasn’t ours. I can’t recall if it was indexed before us, but luckily, Google figured it out and things got fixed pretty quickly. However, this proves that you can sometimes get outranked when syndicating content.


Here’s another example of someone “syndicating” our content. However, Google was smart enough to rank us at the top and show the other site only in the omitted results. The scrapers removed all the backlinks from the post and didn’t even mention the source. The only links we’re getting from it (which are probably harmful anyway) are the ones from the image sources.



As long as you’re genuinely building relationships and actively doing things to benefit the users, there shouldn’t be an issue with syndicated content.


Even though there are risks involved, syndicated content should be indexed, otherwise Google would never know that the original source is so popular.

Dealing With Stolen Content


Stolen content is syndicated content that doesn’t have your permission. Oftentimes, your content is scraped by bots and automatically published on various websites, as mentioned above.



However, webmasters might also republish your content without permission. This shouldn’t bother you too much if you’re ranking high already. However, if you’re not, it can potentially harm you. I’ll share a personal story with you:


Once upon a time, when I first started my SEO blog in Romanian, I wrote a very successful blog post that drove a lot of traffic to my site on the first day. Because I got so excited, I forgot to submit it for indexing. A big publisher liked it so much that it reposted it.


At first, I was kind of proud and happy, even though they didn’t ask me if I agreed, but then I realized my mistake. In a matter of days, their content was ranking on page 1 of Google for the keyword I targeted and my article was nowhere to be found. That Tweet, above, to John Mueller was actually about this situation. Luckily I managed to get a rel=”canonical” from them and soon Google ranked my content instead of theirs.


However, regarding the cognitiveSEO issue where some scraper site was ranking images above us, we actually couldn’t get in touch with them, so we instead decided to contact their hosting provider and report the copyright infringement. You can find out their hosting provider with tools like Who Is Hosting This.


So, you have to first try and reach out to see if you can get a backlink or canonical URL. If not, you can also ask them to remove your content, due to copyright. Reporting this to their hosting provider can get them suspended, so they will probably comply. However, if everything else fails, you can also file a DMCA Report using this tool from Google. Select See more products, then Web Search.

An Alternative to Content Syndication


I haven’t mentioned this yet, but the most important part in content syndication is building relationships with other webmasters. And you know what? You don’t have to post their content on your site. You can simply share it on your social media account. They will eventually share yours as well.


Building relationships with others can be extremely rewarding. And what better way to build relationships than to recommend their work? However, make sure you genuinely enjoy their products, services and content. People will eventually figure out if you try to kiss everyone’s… you get the point. They won’t like it. And don’t just share their content. Engage. Build a connection.


Social media is a great way to build connections and share relevant content without having to deal with any of the complications of content syndication.


If you don’t feel comfortable sharing your competitor’s content, you can always take a step down in your niche. For example if you’re in the SEO field, you might share content within the digital marketing field. If you’re into bikes, you can share sports (roller skates, scooters) or outdoor activities.




In the end, whether to start syndicating content is a personal decision, if not a matter of business model and website purpose. Content syndication can be a great way to promote your content and make a name for yourself. However, you must also embrace the fact that content syndication revolves around an audience rather than search-generated traffic, and if you overdo it, you might get in trouble.


How about you? Have you ever used content syndication as a way to promote your blog? If yes, how did it go? Did you gain traffic? Were you ever outranked? Let us know in the comments section, we really want to find out!

The post Does Syndicated Content Work? Will It Help or Will It Hurt Your SEO? appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.

The Local SEO’s Guide to the Buy Local Phenomenon: A Competitive Advantage for Clients

Posted by on Jul 17, 2018 in SEO Articles | Comments Off on The Local SEO’s Guide to the Buy Local Phenomenon: A Competitive Advantage for Clients


Posted by MiriamEllis

Photo credit: Michelle Shirley

What if a single conversation with one of your small local business clients could spark activity that would lead to an increase in their YOY sales of more than 7%, as opposed to only 4% if you don’t have the conversation? What if this chat could triple the amount of spending that stays in their town, reduce pollution in their community, improve their neighbors’ health, and strengthen democracy?

What if the brass ring of content dev, link opportunities, consumer sentiment, and real-time local inventory is just waiting for you to grab it, on a ride we just haven’t taken yet, in a setting we’re just not talking about?

Let’s travel a different road today, one that parallels our industry’s typical conversation about citations, reviews, markup, and Google My Business. As a 15-year sailor on the Local SEO ship, I love all this stuff, but, like you, I’m experiencing a merging of online goals with offline realities, a heightened awareness of how in-store is where local business successes are born and bred, before they become mirrored on the web.

At Moz, our SaaS tools serve businesses of every kind: Digital, bricks-and-mortar, SABs, enterprises, mid-market agencies, big brands, and bootstrappers. But today, I’m going to go as small and as local as possible, speaking directly to independently-owned local businesses and their marketers about the buy local/shop local/go local movement and what I’ve learned about its potential to deliver meaningful and far-reaching successes. Frankly, I think you’ll be as amazed as I’ve been.

At the very least, I hope reading this article will inspire you to have a conversation with your local business clients about what this growing phenomenon could do for them and for their communities. Successful clients, after all, are the very best kind to have.

What is the Buy Local movement all about?

What’s the big idea?

You’re familiar with the concept of there being power in numbers. A single independent business lacks the resources and clout to determine the local decisions and policies that affect it. Should Walmart or Target be invited to set up shop in town? Should the crumbling building on Main St. be renovated or demolished? Which safety and cultural services should be supported with funding? The family running the small grocery store has little say, but if they join together with the folks running the bakery, the community credit union, the animal shelter, and the bookstore … then they begin to have a stronger voice.

Who does this?

Buy Local programs formalize the process of independently-owned businesses joining together to educate their communities about the considerable benefits to nearly everyone of living in a thriving local economy. These efforts can be initiated by merchants, Chambers of Commerce, grassroots citizen groups, or others. They can be assisted and supported by non-profit organizations like the American Independent Business Alliance (AMIBA) and the Institute for Local Self-Reliance (ILSR).

What are the goals?

Through signage, educational events, media promotions, and other forms of marketing, most Buy Local campaigns share some or all of these goals:

Increase local wealth that recirculates within the community
Preserve local character
Build community
Create good jobs
Have a say in policy-making
Decrease environmental impacts
Support entrepreneurship
Improve diversity/variety
Compete with big businesses

Do Buy Local campaigns actually work?

Yes – research indicates that, if managed correctly, these programs yield a variety of benefits to both merchants and residents. Consider these findings:

1) Healthy YOY sales advantages

ILSR conducted a national survey of independent businesses to gauge YOY sales patterns. 2016 respondents reported a good increase in sales across the board, but with a significant difference which AMIBA sums up:

“Businesses in communities with a sustained grassroots “buy independent/buy local” campaign reported a strong 7.4% sales increase, nearly doubling the 4.2% gain for those in areas without such an alliance.”

2) Keeping spending local

The analysts at Civic Economics conducted surveys of 10 cities to gauge the local financial impacts of independents vs. chain retailers, yielding a series of graphics like this one:

While statistics vary from community to community, the overall pattern is one of significantly greater local recirculation of wealth in the independent vs. chain environment. These patterns can be put to good use by Buy Local campaigns with the goal of increasing community-sustaining wealth.

3) Keeping communities employed and safe

Few communities can safely afford the loss of jobs and tax revenue documented in a second Civic Economics study which details the impacts of Americans’ Amazon habit, state by state and across the nation:

While the recent supreme court ruling allowing states to tax e-commerce models could improve some of these dire numbers, towns and cities with Buy Local alliances can speak plainly: Lack of tax revenue that leads to lack of funding for emergency services like fire departments is simply unsafe and unsustainable. A study done a few years back found that ⅔ of volunteer firefighters in the US report that their departments are underfunded with 86% of these heroic workers having to dip into their own pockets to buy supplies to keep their stations going. As I jot these statistics down, there is a runaway 10,000 acre wildfire burning a couple of hours north of me…

Meanwhile, as others have pointed out:

“According to the Bureau of Labor Statistics, since the end of the Great Recession, small businesses have created 62 percent of all net new private-sector jobs. Among those jobs, 66 percent were created by existing businesses, while 34 percent were generated through new establishments (adjusted for establishment closings and job losses)”.

When communities have Go Local-style business alliances, they are capitalizing on the ability to create jobs, increase sales, and build up tax revenue that could make a serious difference not just to local unemployment rates, but to local safety.

4) Shaping policy

In terms of empowering communities to shape policy, there are many anecdotes to choose from, but one of the most celebrated surrounds a landmark study conducted by the Austin Independent Business Alliance which documented community impacts of spending at the local book and music stores vs. a proposed Borders. Their findings were compelling enough to convince the city not to give a $2.1 million subsidy to the now-defunct corporation.

5) Improving the local environment

A single statistic here is incredibly eye-opening. According to the US Department of Transportation, shopping-related driving per household more than tripled between 1969 and 2009.

All you have to do is picture the centralized location of main street businesses vs. big boxes on the outskirts of town to imagine how city planning has contributed to this stunning rise in time spent on the road. When residents can walk or bike to make daily purchases, the positive environmental impacts are obvious.

6) Improving residents’ health and well-being

A recent Cigna survey of 20,000 Americans found that nearly half of them always or sometimes feel lonely, lacking in significant face-to-face interactions with others. Why does this matter? Because the American Psychological Association finds that you have a 50% lower chance of dying prematurely if you have quality social interactions.

There’s a reason author Jan Karon’s “Mitford” series about life in a small town in North Carolina has been a string of NY Times Best Sellers; readers and reviewers continuously state that they yearn to live someplace like this fictitious community with the slogan “Mitford takes care of its own”. In the novels, the lives of residents, independent merchants, and “outsiders” interweave, in good times and bad, creating a support network many Americans envy.

This societal setup must be a winner, as well as a bestseller, because the Cambridge Journal of Regions published a paper proposing that the concentration of small businesses in a given community can be equated with levels of public health.

Beyond the theory that eating fresh and local is good for you, it turns out that knowing your farmer, your banker, your grocer could help you live longer.

7) Realizing big-picture goals

Speaking of memorable stories, this video from ILSR does a good job of detailing one view of the ultimate impacts independent business alliances can have on shaping community futures:

I interviewed author and AMIBA co-founder Jeff Milchen about the good things that can happen when independents join hands. He summed it up:

“The results really speak for themselves when you look at what the impact of public education for local alliances has been in terms of shifting culture. It’s a great investment for independent businesses to partner with other independents, to do things they can’t do individually. Forming these partnerships can help them compete with the online giants.”

Getting going with a Go Local campaign, the right way

If sharing some of the above with clients has made them receptive to further exploration of what involvement in an independent business alliance might do for them, here are the next steps to take:

First, find out if a Go Local/Shop Local/Buy Local/Stay Local campaign already exists in the business’ community. If so, the client can join up.
If not, contact AMIBA. The good folks there will know if other local business owners in the client’s community have already expressed interest in creating an alliance, and they can help connect the interested parties.
I highly, highly recommend reading through AMIBA’s nice, free primer covering just about everything you need to know about Go Local campaigns.
Encourage the client to publicize their intent to create an alliance if none exists in their community. Write an op-ed for the local print news, put it on social media, talk to neighbors. This can prompt outreach from potential allies in the effort.
A given group can determine to go it alone, but it may be better to rely on the past experience of others who have already created successful campaigns. AMIBA offers a variety of paid community training modules, including expert speakers, workshops, and on-site consultations. Each community can write in to request a quote for a training plan that will work best for them. The organization also offers a wealth of free educational materials on their website.
According to AMIBA’s Jeff Milchen, a typical Buy Local campaign takes about 3-4 months to get going.

It’s important to know that Go Local campaigns can fail, due to poor execution. Here is a roundup of practices all alliances should focus on to avoid the most common pitfalls:

Codify the definition of a “local” business as being independently-owned-and-run, or else big chain inclusion will anger some members and cause them to leave.
Emphasize all forms of local patronage; campaigns that stick too closely to words like “buy” or “shop” overlook the small banks, service area businesses, and other models that are an integral part of the independent local economy.
Ensure diversity in leadership; an alliance that fails to reflect the resources of age, race, gender/identity, political views, economics and other factors may wind up perishing from narrow viewpoints. On a related note, AMIBA has been particularly active in advocating for business communities to rid themselves of bigotry. Strong communities welcome everyone.
Do the math of what success looks like; education is a major contributing factor to forging a strong alliance, based on projected numbers of what campaigns can yield in concrete benefits for both merchants and residents.
Differentiate inventory and offerings so that independently-owned businesses offer something of added value which patrons can’t easily replicate online; this could be specialty local products, face-to-face time with expert staff, or other benefits.
Take the high road in inspiring the community to increase local spending; campaigns should not rely on vilifying big and online businesses or asking for patronage out of pity. In other words, guilt-tripping locals because they do some of their shopping at Walmart or Amazon isn’t a good strategy. Even a 10% shift towards local spending can have positive impacts for a community!
Clearly assess community resources; not every town, city, or district hosts the necessary mix of independent businesses to create a strong campaign. For example, approximately 2.2% of the US population live in “food deserts”, many miles from a grocery store. These areas may lack other local businesses, as well, and their communities may need to create grassroots campaigns surrounding neighborhood gardens, mobile markets, private investors and other creative solutions.

In sum, success significantly depends on having clear definitions, clear goals, diverse participants and a proud identity as independents, devoid of shaming tactics.

Circling back to the Web — our native heath!

So, let’s say that your incoming client is now participating in a Buy Local program. Awesome! Now, where do we go from here?

In speaking with Jeff Milchen, I asked what he has seen in terms of digital marketing being used to promote the businesses involved in Buy Local campaigns. He said that, while some alliances have workshops, it’s a work in progress and something he hopes to see grow in the future.

As a Local SEO, that future is now for you and your fortunate clients. Here are some ways I see this working out beautifully:

Basic data distribution and consistency

Small local businesses can sometimes be unaware of inconsistent or absent local business listings because the owners are just so busy. The quickest way I know to demo this scenario is to plug the company name and zip into the free Moz Check Listing tool to show them how they’re doing on the majors. Correct data errors and fill in the blanks, either manually or by using affordable software like Moz Local. You’ll also want to be sure the client has a presence on any geo- or industry-specific directories and platforms. It’s something your agency can really help with!
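Tools like the ones above do this at scale, but the underlying "consistency check" is conceptually simple: normalize each name/address/phone field and compare it against the business's canonical record. Here is a minimal sketch with hypothetical listing data (not any tool's actual API):

```python
import re

def norm_text(s):
    # Lowercase, drop punctuation, collapse whitespace so "St." matches "st"
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", s.lower())).strip()

def norm_phone(s):
    # Keep digits only so "(555) 123-4567" matches "555.123.4567"
    return re.sub(r"\D", "", s)

def nap_mismatches(reference, listings):
    """Flag listing fields that disagree with the business's reference record."""
    issues = []
    for listing in listings:
        for field, norm in (("name", norm_text), ("address", norm_text), ("phone", norm_phone)):
            if norm(listing[field]) != norm(reference[field]):
                issues.append((listing["source"], field, listing[field]))
    return issues
```

A report built from output like this gives the owner a concrete, prioritized cleanup list instead of a vague warning about "citation inconsistency."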

A hyperlocalized content powerhouse

Build proud content around the company’s involvement in the Buy Local program.

Write about all of the economic, environmental, and societal benefits residents can support by patronizing the business.
Motivated independents take time to know their customers. There are stories in this. Write about the customers and their needs. I’ve even seen independent restaurants naming menu items after beloved patrons. Get personal. Build community.
Don’t forget that even small towns can be powerful points of interest for tourists. Create a warm welcome for travelers, and for new neighbors, too!
Link building opportunities of a lifetime

Local business alliances form strong B2B bonds.

Find relationships with related businesses that can sprout links. For example, the caterer knows the wedding cake baker, who knows the professional seamstress, who knows the minister, who knows the DJ, who knows the florist.
Dive deep into opportunities for sponsoring local organizations, teams and events, hosting and participating in workshops and conferences, offering scholarships and special deals.
Make fast friends with local media. Be newsworthy.
A wellspring of sentiment

Independents form strong business-to-community bonds.

When a business really knows its customers, asking for online reviews is so much easier. In some communities, it may be necessary to teach customers how to leave reviews, but once you get a strategy going for this, the rest is gravy.
It’s also a natural fit for asking for written and video testimonials to be published on the company website.
Don’t forget the power of Word of Mouth Marketing, while you’re at it. Loyal patrons are an incredible asset.
The one drawback could be if your business model is one of a sensitive nature. Tight-knit communities can be ones in which residents are more desirous of protecting their privacy.

Digitize inventory easily

30% of consumers say they’d buy from a local store instead of online if they knew the store was nearby (Google). Over half of consumers prefer to shop in-store to interact with products (Local Search Association). Over 63% of consumers would rather buy from a company they consider to be authentic over the competition (Bright Local).

It all adds up to the need for highly-authentic independently-owned businesses to have an online presence that signals to Internet users that they stock desired products. For many small, local brands, going full e-commerce on their website is simply too big of an implementation and management task. It’s a problem that’s dogged this particular business sector for years. And it’s why I got excited when the folks at AMIBA told me to check out Pointy.

Pointy offers a physical device that small business owners can attach to their barcode scanner to have their products ported to a Pointy-controlled webpage. But, that’s not all. Pointy integrates with the “See What’s In Store” inventory function of Google My Business Knowledge Panels. Check out Talbot’s Toyland in San Mateo, CA for a live example.
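Pointy's integration is its own product, but the general idea of signaling stocked products to search engines can also be approached with schema.org Product markup. A minimal sketch, generating generic schema.org JSON-LD for a hypothetical product record (this is the public schema.org vocabulary, not Pointy's or Google's specific inventory feed format):

```python
import json

def product_jsonld(name, sku, price, currency="USD"):
    """Build schema.org Product markup advertising in-store availability."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "sku": sku,
        "offers": {
            "@type": "Offer",
            "price": f"{price:.2f}",
            "priceCurrency": currency,
            # schema.org ItemAvailability value for brick-and-mortar stock
            "availability": "https://schema.org/InStoreOnly",
        },
    }, indent=2)
```

The resulting JSON-LD would be embedded in a `<script type="application/ld+json">` tag on the product page, giving crawlers machine-readable confirmation that the item is carried locally.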

Pointy is a startup, but one that is exciting enough to have received angel investing from the founder of WordPress and the co-founder of Google Maps. Looks like a real winner to me, and it could provide a genuine answer for brick-and-mortar independents who have found their sales staggering in the wake of Amazon and other big digital brands.

Local SEOs have an important part to play

Satisfaction in work is a thing to be cherished. If the independent business movement speaks to you, bringing your local search marketing skills to these alliances and small brands could make more of your work days really good days.

The scenario could be an especially good fit for agencies that have specialized in city or state marketing. For example, one of our Moz Community members confines his projects to South Carolina. Imagine him taking it on the road a bit, hosting and attending workshops for towns across the state that are ready to revitalize main street. An energetic client roster could certainly result if someone like him could show local banks, grocery stores, retail shops and restaurants how to use the power of the local web!

Reading America

Our industry is living and working in complex times.

The bad news is, a recent Bush-Biden poll finds that 8 in 10 US residents are “somewhat” or “very” concerned about the state of democracy in our nation.

The not-so-bad news is that citizen ingenuity for discovering solutions and opportunities is still going strong. We need only look as far as the runaway success of the TV show “Fixer Upper”, which drew 5.21 million viewers in its fourth season, making it the second-largest telecast of Q2 that year. The show centered on the revitalization of dilapidated homes and businesses in and around Waco, Texas, and has turned the entire town into a major tourist destination, pulling in millions of annual visitors and landing book deals, a magazine, and the Magnolia Home furnishing line for its entrepreneurial hosts.

While not every town can (or would want to) experience what is being called the “Magnolia effect”, channels like HGTV and the DIY network are heavily capitalizing on the rebirth of American communities, and private citizens are taking matters into their own hands.

There’s the family who moved from Washington D.C. to Water Valley, Mississippi, bought part of the decaying main street and began to refurbish it. I found the video story of this completely riveting, and look at the Yelp reviews of the amazing grocery store and lunch counter these folks are operating now. The market carries local products, including hoop cheese and milk from the first dairy anyone had opened in 50 years in the state.

There are the half-dozen millennials who are helping turn New Providence, Iowa into a place young families can live and work again. There’s Corning, NY, Greensburg, KS, Colorado Springs, CO, and so many more places where people are eagerly looking to strengthen community sufficiency and sustainability.

Some marketing firms are visionary forerunners in this phenomenon, like Deluxe, which has sponsored the Small Business Revolution show, doing mainstreet makeovers that are bringing towns back to life. There could be a place out there somewhere on the map of the country, just waiting for your agency to fill it.

The best news is that change is possible. A recent study in Science magazine states that the tipping point for a minority group to change a majority viewpoint is 25% of the population. This is welcome news at a time when 80% of citizens are feeling doubtful about the state of our democracy. There are 28 million small businesses in the United States – an astonishing potential educational force – if communities can be taught what a vote with their dollar can do in terms of giving them a voice. As Jeff Milchen told me:

“One of the most inspiring things is when we see local organizations helping residents to be more engaged in the future of their community. Most communities feel somewhat powerless. When you see towns realize they have the ability to shift public policy to support their own community, that’s empowering.”

Sometimes, the extremes of our industry can make our society and our democracy hard to read. On the one hand, the largest brands developing AI, checkout-less shopping, driverless cars, same-day delivery via robotics, and the gig economy win applause at conferences.

On the other hand, the public is increasingly hearing the stories of employees at these same companies who are protesting Microsoft developing face recognition for ICE, Google’s development of AI drone footage analysis for the Pentagon, working conditions at Amazon warehouses that allegedly preclude bathroom breaks and have put people in the hospital, and the various outcomes of the “Walmart Effect”.

The Buy Local movement is poised in time at this interesting moment, in which our democracy gets to choose. Gigs or unions? Know your robot or know your farmer? Convenience or compassion? Is it either/or? Can it be both?

Both big and small brands have a major role to play in answering these timely questions and shaping the ethics of our economy. Big brands, after all, have tremendous resources for raising the bar for ethical business practices. Your agency likely wants to serve both types of clients, but it’s all to the good if all business sectors remember that the real choosers are the “consumers”, the everyday folks voting with their dollars.

I know that it can be hard to find good news sometimes. But I’m hoping what you’ve read today gifts you with a feeling of optimism that you can take to the office, take to your independently-owned local business clients, and maybe even help take to their communities. Spark a conversation today and you may stumble upon a meaningful competitive advantage for your agency and its most local customers.

Every year, local SEOs are delving deeper and deeper into the offline realities of the brands they serve, large and small. We’re learning so much, together. It’s sometimes a heartbreaker, but always an honor, being part of this local journey.


Artificial intelligence for marketers

Posted by on Jul 17, 2018 in SEO Articles | Comments Off on Artificial intelligence for marketers

Artificial Intelligence (AI) has blown up in the past few years and is quickly starting to take over the world of business marketing. From digital assistants like Alexa, Google Home, and Siri to search algorithms, AI allows consumers to access the information they want quickly and efficiently.

It’s predicted that the world of AI is only going to continue to grow until it’s incorporated into most aspects of our business and personal lives. Biz Journals anticipates that “62% of enterprises will use AI technologies by 2018”, up from the 38% of businesses that were using them in 2017.

Freeing up marketers’ time

Some are afraid that AI is going to take jobs away from marketers by performing tasks usually carried out by humans, but this isn’t the case.

Rather, AI is able to quickly carry out a lot of the time-consuming, tedious tasks previously shouldered by small business owners, freeing their time to focus on more in-depth tasks that require a human level of personalization. Let AI handle tasks like recommendations and customer service so that marketers can focus on being creative and developing imaginative, engaging campaigns – something that AI is definitely not capable of doing.

Customer service is one area in which AI – specifically chatbots – should absolutely be used. In 2018 consumers expect to have their questions and concerns answered immediately, any hour of the day and day of the week, and this is a demand that humans can’t possibly meet, but robots can.

In addition to freeing up time for marketers, AI is accelerating marketing and sales. As one commentator explains, by “giving robots access to [your] brand you’re giving consumers the same access”. Embrace this thought and give AI opportunities to expand on your marketing efforts. Consumers are using AI in their search efforts in order to find what they’re looking for faster than they ever have before. Marketers need to make sure their content is optimized to meet these demands.

AI will be able to increase brand sophistication by analyzing copious amounts of consumer information. It uses machine learning to anticipate what a customer wants and needs faster than any human is able, and in turn this increases brand engagement and sophistication, and promotes customer loyalty.

The fact of the matter is that humans just don’t have the capability to access and analyze the huge amounts of customer data that AI can go through in a matter of seconds.
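The pattern-mining behind "anticipating what a customer wants" can be illustrated with a toy co-purchase recommender: score items by how often other customers bought them alongside the current basket. A minimal sketch with hypothetical purchase data (real systems are vastly more sophisticated):

```python
from collections import Counter

def recommend(purchase_histories, basket, top=2):
    """Suggest items other customers bought alongside the current basket."""
    scores = Counter()
    for history in purchase_histories:
        # Only histories that share something with the basket are informative
        if set(basket) & set(history):
            scores.update(set(history) - set(basket))
    return [item for item, _ in scores.most_common(top)]
```

Even this crude version surfaces "customers who bought X also bought Y" suggestions; production systems layer in browsing data, recency, and learned embeddings at a scale no human analyst could match.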

Data analysis

In order to truly reap the benefits of AI, it must be used correctly. AI should be used to deliver highly personalized and relevant messages. Consumers don’t want to feel like they’re being marketed to by a robot, even if that is the reality of the situation.

Generation Y and Z consumers want “a truly personalized experience [on websites] and within messages”. In the past, businesses could develop big marketing campaigns that appealed to huge amounts of people and they were wildly effective; but those days are gone.

Chatbots and virtual assistants are ‘the face’ of AI marketing and should be used accordingly. Although AI in marketing is more than just digital assistants, products like Alexa and Siri get most of the attention from businesses and from consumers. A good point has been made that this could be because these assistants act like humans: even though consumers love computers and all their capabilities, they still want to feel like they’re interacting with a human. Remember this when crafting your AI marketing efforts and make sure your campaigns have that human element to them.

Also keep in mind that search queries using digital assistants are only expected to increase in the coming years, so make sure your mobile site is optimized to meet this demand.

Social media marketing

AI can also be used in social media marketing. It’s already been used for targeting advertising, but AI can do so much more in the field of social media to help followers connect and engage with brands.

AI can quickly scan through social media content, data, and user history to help marketers create more relevant content. Facebook uses AI extensively, from automatically tagging faces in photos to determining which stories show up in a user’s newsfeed.

Despite how much better AI can make the user experience on social media platforms, many companies are still hesitant to incorporate it into their own social media marketing efforts. Hopefully this article has shown that instead of taking away jobs from marketers, AI will be able to free up some time so that they can focus on tasks that need a human element to them instead of drowning in data and mundane chores.


Have you incorporated AI into your business marketing plan?  How do you anticipate AI impacting your company’s marketing goals?  Comment in the section below!  Also, for more information on how AI is projected to impact marketing in the future, check out this article by Search Engine Watch.


Amanda DiSilvestro is a writer for No Risk SEO, an all-in-one reporting platform for agencies. You can connect with Amanda on  LinkedIn, or check out her content services at

An Evolution of SEO Techniques – What’s in and what’s out

Posted by on Jul 16, 2018 in SEO Articles | Comments Off on An Evolution of SEO Techniques – What’s in and what’s out


Just as the World Wide Web has been growing rapidly since its creation, the same could be said of SEO.

What was working 5 years ago might have lost half of its credibility today. With the one and almost only search engine giant, Google, pumping out new tweaks to its algorithm every day, you can only imagine that all SEOs are kept on their toes all the time.

Here I have compiled a list of SEO techniques that you may or may not have heard of before. Let’s look at them together and you’ll see whether they are in or out.


A long, long time ago, when Google still wasn’t the total dominator of the internet world, nor was its algorithm as complex and fine-tuned as it is now, the keyword was literally THE key. It was the only SEO technique. Searches were answered by a list of websites that literally matched the words used in the query.

If you looked for “red sneakers”, the search engine would give you results of pages most heavily loaded with “red sneakers”. Noticing this, smart online marketers started stuffing as many keywords as they could into a web page so it would rank.

The extremes they would go to included making sentences like this: Red sneakers dot com is the best place to shop for a variety of red sneakers including red sneakers for boys and girls you can absolutely get your desired red sneakers on red sneakers dot com.

Well, isn’t that an eyesore. What’s better (or worse, really) is that they even styled the keywords so they would blend into the background of the web page, invisible to visitors but still visible to the crawlers, so the page could be ranked for the keyword.

That way they would be ranked for whichever keyword they wanted, as long as they stuffed as many keywords as they could into a web page.
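Early ranking really was close to simple term counting, which is why stuffing paid off. A toy keyword-density score makes the incentive obvious (a simplification for illustration, not any engine's actual formula):

```python
import re

def keyword_density(text, phrase):
    """Fraction of words in `text` that belong to occurrences of `phrase`."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    # Count non-overlapping-agnostic sliding-window matches of the phrase
    hits = sum(
        1 for i in range(len(words) - len(phrase_words) + 1)
        if words[i:i + len(phrase_words)] == phrase_words
    )
    return hits * len(phrase_words) / max(len(words), 1)
```

A stuffed page scores near 1.0 while natural prose scores far lower, so under a frequency-driven ranker, the spammy page "wins" – exactly the behavior Google's penalties were designed to kill.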

Google, of course, noticed this unethical practice and put an end to it by updating its algorithm and giving out penalties accordingly.

But, here’s the point: keywords still weigh heavily for ranking. How else can users get what they want if they don’t use keywords? A query is, in its essence, made by typing in a word or a phrase. So how can a page rank for a keyword without keyword stuffing?


Get on that keyword research action!

Keywords are the core of SEO and play a big part in Google ranking. Now that keyword stuffing is out of the way, how can you rank?

Also known as keyword optimization, this SEO technique is the way to get ranked for a keyword while simultaneously not pissing Google off. It is the leveled-up technique for utilizing keywords.

Keyword research is done by conducting research (surprise!) and analysis, then selecting only the best keywords to target. Every niche has a specific set of keywords that get more search volume, drive more traffic, or convert the most. But how can we know what those keywords are? There’s no shortcut to this. You need to do some in-depth research.

Keywords are the essence of SEO, and choosing a relevant keyword can make or break a business. Too general? You’ll get buried under a thousand other generic websites. Too specific? You might not get enough visitors to sustain the business.

The most straightforward way is to search for the niche that you’re targeting and see what the highly ranked pages are. Then start analyzing them, stripping them down to see which keywords they are using.

You can also use tools like Google Keyword Planner or Google Trends to get an idea of how a certain keyword will perform or how relevant it is.


Remember keyword stuffing? An even uglier version of keyword stuffing is using irrelevant keywords. This is the kind of SEO technique that you should definitely avoid.

Web pages want to rank. They also want to rank for as many keywords as possible. During the good old days, when SEO was more straightforward and simple and keyword stuffing still worked like a charm, marketers figured out that they could also utilize irrelevant keywords.

Combined with the technique of making keywords invisible to visitors, they started stuffing different keywords on a single page.

Now red sneakers dot com would not only rank for red sneakers; it would also rank for best camping tent, walkman for sale, sushi restaurant in LA. This list of keywords doesn’t even make sense, right? Nor did it to the innocent users. Imagine searching for a sushi restaurant in LA but being directed to red sneakers dot com. Not even JAPANESE sneakers, but RED sneakers.

You can imagine how unhappy the users were. Neither was Google impressed, so this kind of practice is heavily criticized and frowned upon. Those who dare to try it now will be on the receiving end of Google’s merciless wrath.

IN: LSI Keyword

Instead of trying to rank for multiple keywords that may or may not be related to your niche, targeting a few selected keywords while adding LSI keywords to the pile will work much better.

What is LSI Keyword? LSI stands for Latent Semantic Indexing.

Along the way, Google figured out that instead of focusing solely on keywords, it should also start actually understanding the content of a website. What happens is that instead of a single keyword, it started associating a topic keyword with a group of related keywords. This gave birth to a brand new SEO technique.

Let’s look at an example. The LSI keywords for “dog” could be “dog facts”, “dog pictures”, “dog flu”, “dog for sale”, etc.

LSI keywords for dog

Related searches for the word dog show you what Google thinks are the keywords related to “dog”.

So the short answer to what an LSI keyword is: whatever other keywords Google thinks are related to your main keyword. From its point of view, if you’re talking about “dog”, it’s only natural that you might start talking about “dog for sale”.

By analyzing whether your web page is laden with LSI keywords, instead of looking at how many times a keyword is used, Google can figure out (to the best of its ability, anyway) how relevant your web page is to a certain topic, thereby influencing your Google rank.
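Textbook LSI factorizes a term-document matrix with SVD, but the core intuition – terms earn association by appearing together across many documents – can be sketched with simple co-occurrence counting. A toy stand-in with hypothetical documents (not Google's actual method):

```python
from collections import Counter

def related_terms(documents, seed, top=3):
    """Rank terms by how often they co-occur with `seed` across documents,
    a crude stand-in for the term associations LSI derives."""
    co = Counter()
    for doc in documents:
        terms = set(doc.lower().split())
        if seed in terms:
            # Every other term in a seed-containing doc gets credit
            co.update(terms - {seed})
    return [term for term, _ in co.most_common(top)]
```

Run over a large corpus, counts like these reveal that "training", "flu", and "for sale" travel with "dog" while "sushi" does not – which is exactly the signal that lets a ranker judge topical relevance without raw keyword counts.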


No dear, it doesn’t work that way.

Now that the search engine giant noticed that keywords were becoming a less reliable signal for providing users with relevant search results, it came up with another factor: links.

The reasoning behind links is this: a website would not voluntarily link to another website if it didn’t find it relevant, informative, or trustworthy. Based on this reasoning, Google started using link volume as a ranking criterion.

However, as always, we can never have nice things, because people started abusing it, again. How? They came up with link farms, selling links, trading links – whatever way they could think of to abuse links to get a higher rank. This is the kind of SEO technique that you should stay away from.

According to the almighty Google buying links means: “exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a “free” product in exchange for them writing about it and including a link.”

If you think you wouldn’t mind trading a couple hundred dollars for a link and a little push up the SERP, think again.

A link is much more complicated than it seems because of PageRank. Under this algorithm, each link has a different value based on several criteria, from page authority and page content relevancy to how many outbound links the linking page has.
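The core PageRank iteration is public and simple to sketch: each page repeatedly splits its score among its outbound links, so a link's value depends on the linker's own score and how many other links dilute it. A simplified version (no dangling-node handling; every linked page must appear as a key):

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively distribute each page's score across its outbound links.

    `links` maps a page to the pages it links to; pages receiving links
    from high-scoring, low-outdegree pages accumulate the highest scores.
    """
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Base score every page gets regardless of links
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += share
        rank = new
    return rank
```

This is why one bought link from a page with hundreds of outbound links is worth a tiny fraction of an earned link from a well-regarded page with few.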

So you can imagine that buying links isn’t as simple as handing over the money and walking out with a rank 1 web page. You might also never get off Google’s blacklist.


Now it is certain that buying links will not only not work; you also risk getting penalized for being caught doing something that Google clearly frowns upon.

So how can you get that link juice that is clearly important for ranking? Short answer: effort and sweat and blood. (not real blood but you get the idea)

As far as Google’s reasoning goes, a link is hard to fake because people will only willingly link to you if they find your content worth sharing, informative, and authoritative. So try to be all of those.

Invest time and effort into producing quality content that is relevant to your market and consumers. It might take some time, but people will start to notice and slowly you will get a link here and there.

Hey, Rome wasn’t built in a day, and neither is a rank 1 page. Instead of spending money on buying links, it’s better to spend money on creating good content. At least honest work won’t get you penalized by Google and wiped out from the index.


SEO goes hand in hand with content. White hat SEO goes with good content; black hat SEO goes with lazy content. How do the black hats do it? By duplicating others' hard work.

Duplicate content is content that appears at multiple URLs across the World Wide Web.

Duplicate content is tricky for search engines because they don't know:

which one is the original,
which one to rank,
whether the copies should be treated as different versions or not.
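One standard way publishers resolve this ambiguity is the canonical link element, which tells search engines which URL is the original version that should be ranked. A minimal sketch, with a placeholder URL:

```html
<!-- Placed in the <head> of each duplicate page; example.com is a placeholder. -->
<link rel="canonical" href="https://www.example.com/original-article/" />
```

With this in place, ranking signals are consolidated onto the declared original instead of being split across the copies.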

Google is smart, but things like this can still confuse it. So some marketers will plagiarize a piece of content and hope to get ranked for it. Hey, isn't that stealing? You might ask, and yes, it is. Google rolled out the Panda update to counter unethical SEO techniques like this. And if you come across someone plagiarizing your stuff, you can file a complaint to have it taken down.

Besides duplicated pages, pages with little value that are laden with keywords, as well as scraped work republished as someone's own, are also considered thin content.

These tactics might have worked in the past, but the era is now long gone.


Not just any content… but quality content!

For SEO and inbound marketing, quality content is king. Pumping out quality, relevant, and creative content consistently is one of the best SEO tactics you can employ.

You might have noticed that a lot of e-commerce websites come with a blog. Besides being a platform for publishing informative and insightful articles, a blog also acts as a magnet for incoming traffic and search rankings.

Now, if you have a website selling baby products, it's only natural for you to publish articles related to them, be it the best time to start feeding a baby solid food or how to make sure your house is childproof. Those are the topics that interest your potential customers.

With each article published, you are building up trust and a sense of authority with your visitors. When they want to make a purchase, they are more likely to buy from you, because in their eyes you are an authority in your niche and they trust your expertise.

Remember keyword research and LSI keywords? Every time you publish an article, you are adding keywords and LSI keywords to your website, making it more and more relevant to your niche. Paired with keyword research, you can consistently put out quality blog posts that are relevant to and interest your potential customers, turning them into actual customers.

To put it shortly, quality content can attract visitors, boost SERP rankings, build authority and trust, and convert.

Tempted to start blogging on your e-commerce site yet? Like Nike said, just do it.


How cloaking works.

Want to know one of the worst SEO techniques, one that will definitely get you on Google's bad side? The answer is cloaking.

What is cloaking? As the word itself suggests, cloaking means covering something up to hide what it actually is.

This is a technique where a site shows the search engine one version of a page but shows visitors another. The site is covering itself up and hiding its true form from the search engine crawlers.

Why would someone do that? Mostly to deceive the search engine into granting a higher rank. With a higher rank, the website can attract a bigger audience. But you can imagine the user experience is terrible, since visitors are deceived into a website that does not host the content it claims to. That totally defeats Google's purpose of catering to what users want.
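As an illustration of the pattern (so you can recognize it, not copy it), cloaking usually boils down to branching on the crawler's user-agent string. The page contents below are invented for the example:

```python
# Illustration only: this is the pattern Google penalizes.
def serve_page(user_agent: str) -> str:
    """Cloaking: keyword-stuffed HTML for crawlers, something else for humans."""
    crawler_signatures = ("Googlebot", "bingbot")
    if any(sig in user_agent for sig in crawler_signatures):
        # The version shown to search engines: stuffed with keywords.
        return "<h1>Best cheap flights deals flights cheap</h1>"
    # The version real visitors actually see.
    return "<h1>Download our totally unrelated toolbar!</h1>"

assert "flights" in serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)")
assert "toolbar" in serve_page("Mozilla/5.0 (Windows NT 10.0)")
```

Note that serving genuinely different content by device or language is fine; the violation is specifically lying to the crawler about what humans will see.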

Google by no means tolerates any form of cloaking; it is considered a violation of Google's Webmaster Guidelines. You might get a boosted ranking for a few weeks, but is it worth it when your website gets penalized and wiped off the index? Nope, with a capital N.


Unlike cloaking, which hides from the search engine, schema markup works in exactly the opposite way: you are telling the search engines what the page is, down to exactly what each line on the page means. Schema.org is the brainchild of Google, Bing, Yandex, and Yahoo!. Seeing all these big names in the same sentence, you can tell that Schema is THE thing.

These search engines came together to help you correctly tag your web pages so that they can display your pages accurately and more specifically on the SERP. The more specific your markup is, the easier it is to match user search intent, therefore providing the better user experience that the search engines strive to deliver.

Because of the complexity of a web page, it is difficult for crawlers to correctly understand each phrase according to its intent. By correctly utilizing schema, you are basically telling the search engine what is what.

Whether your page is an article or news, whether it is about tourism or fashion, who the author is, etc.: there are a lot of things that can be marked up to tell the search engine exactly what they are, therefore giving visitors a much more accurate and relevant portrayal.

Schema markup can easily be added to your HTML, so you have no excuse not to utilize it. Google even has a tool to help you do just that.
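As a sketch of what such markup can look like, here is a hypothetical JSON-LD snippet for a blog post; the headline, author name, and date are invented for the example:

```html
<!-- Dropped into the page's <head> or <body>; values here are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "10 SEO Techniques Reviewed",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2018-07-18"
}
</script>
```

The crawler reads this block and knows, without guessing, that the page is a blog post, who wrote it, and when it was published.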

Note that the whole schema vocabulary is much more comprehensive and complex; the complete list of properties can be viewed on the Schema.org website.

A part of the list on blog properties and their schema markup tags.

Now that you have a general idea of these 10 SEO techniques, I trust you to have good judgment on which techniques to employ. At the end of the day, remember that sticking to Google's Webmaster Guidelines will forever be the correct answer.


Why I Spent $500,000 Buying a Blog That Generates No Revenue

Posted by on Jul 16, 2018 in SEO Articles | Comments Off on Why I Spent $500,000 Buying a Blog That Generates No Revenue


(If you are wondering, the image of me above was taken when I used to work at KISSmetrics with Hiten Shah… I used to have hair)

In early January 2017, I purchased the KISSmetrics website for $500,000.

If you go to the site, you'll notice that it forwards here, to my site (which I will get into later in the post).

The $500,000 didn’t get me the company, KISSmetrics, or any of the revenue streams. The parent company, Space Pencil, is continually improving and developing the product.

And on top of that, there are restrictions. I can’t just pop up a competing company or any company on the KISSmetrics site.

So why did they sell me the domain? And why would I pay $500,000 for it?

I can’t fully answer why they sold it, but I do know a lot of their customers came from word of mouth, conferences, paid ads, and other forms of marketing that didn’t include SEO or content marketing.

For that reason, the domain probably wasn’t as valuable to them as it was to me. And of course, who wouldn’t want extra cash?

I’m assuming they are very calculated because they are an analytics company, so they probably ran the numbers on how much revenue the inbound traffic was generating them and came to the conclusion that the $500,000 price tag seemed worth it.

Now, before I get into why I spent $500,000 on the domain, let me first break down my thought process as I am buying out a lot of properties in the marketing space (more to be announced in the future).

Why am I buying sites that aren’t generating revenue?

This wasn’t the first or the last site that I’ll buy in the space.

I recently blogged about how I bought Ubersuggest. And it wasn’t generating a single dollar in revenue.

Well technically, there were ads on the site, but I quickly killed those off.

And eventually, I ported it over to this site.

When I am looking at sites to buy, I am only looking for 1 thing… traffic. And of course, the quality (and relevancy) of that traffic.

See, I already have a revenue stream, which is my ad agency, Neil Patel Digital.

So, my goal is to find as many sites as I can that have a traffic profile similar to this one and leverage them to drive my agency more leads.

How do you know you won’t lose money?

I don’t!

This approach doesn’t guarantee I’ll make more money.

I look at the business as tons of tiny experiments. You don’t build a huge business through one simple marketing strategy or tactic.

You have to combine a lot of little things to get your desired outcome.

And sometimes you’ll make mistakes along the way that will cost you money, which is fine. You have to keep one thing in mind… without testing, you won’t be big.

With my ad agency, we tend to mainly have U.S. clients. Yes, we serve other regions as well… for example, we have an ad agency in Brazil.

But I myself mainly focus on driving traffic to the U.S. ad agency, and the other teams just replicate as I don’t speak Portuguese, German, or any of the required languages for the other regions we are in.

So, when I buy companies, I look for traffic that is ideally in the U.S.

Sure, the ad agency can work with companies in Australia, Canada, and even the United Kingdom, but it’s tough.

There’s a huge difference in currency between Australia and the U.S. and the same goes for Canada.

And with the U.K. there is a 5 to 8-hour time zone difference, which makes it a bit more difficult to communicate with clients.

That’s why when I buy a site, I’m ideally looking for U.S. traffic.

When I bought Ubersuggest it had very little U.S. traffic. Indonesia and India were the two most popular regions.

But I bought it because I knew I could build a much better tool and over time grow the U.S. traffic by doing a few email blasts, getting on Product Hunt, and by creating some press.

And I have…

As you can see from the screenshot above, U.S. is the most popular region followed by India and Brazil.

Over time it shouldn’t be too difficult to 3 or even 4x that number as long as I release more features.

Now, my costs on Ubersuggest have gotten into the 6 figures per month, and I am not generating any income from it.

There is no guarantee that it will generate any revenue, but I have a pretty effective sales funnel, which I will share later in the post. Because of that sales funnel my risk with Ubersuggest is pretty low.

As long as I can grow the traffic enough, I should be able to monetize.

What about KISSmetrics?

As for KISSmetrics, I mainly bought the domain for the blog traffic.

During its peak it was generating 1,260,681 unique visitors per month:

By the time I bought the blog, traffic had dropped to 805,042 unique visitors per month:

That’s a 36% drop in traffic. Ouch!

And then to make matters worse, I decided that I wanted to cut the traffic even more.

There were so many articles on KISSmetrics that were outdated and irrelevant, so I had no choice but to cut them.

For example, there were articles about Vine (which Twitter purchased and killed), Google Website Optimizer (no longer exists), Mob Wars (a Facebook game that no longer exists)… and the list goes on and on.

In addition to that, I knew that I could never monetize irrelevant traffic. Yes, more traffic is good, but only as long as it is relevant.

I instantly cut the KISSmetrics blog in half by “deleting” over 1,024 blog posts. Now, I didn’t just delete them; I made sure I added 301 redirects to the most relevant pages here on this site.
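The pruning step can be pictured as a simple redirect map: every retired URL gets a 301 pointing at the closest relevant live page, with a fallback for anything unmapped. The slugs below are hypothetical, not the actual KISSmetrics URLs:

```python
# Sketch of a 301 redirect map for pruned posts. Slugs are made up.
REDIRECTS = {
    "/blog/vine-marketing-tips/": "/blog/video-marketing/",
    "/blog/google-website-optimizer/": "/blog/ab-testing/",
}

def redirect_for(old_path, fallback="/blog/"):
    """Return the (status, target) pair a web server would emit for old_path."""
    return 301, REDIRECTS.get(old_path, fallback)

assert redirect_for("/blog/vine-marketing-tips/") == (301, "/blog/video-marketing/")
```

A permanent 301 (rather than a 302 or a plain 404) is what passes the old page's link equity along to the new target.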

Once I did that, my traffic dropped again. I was now sitting at 585,783 unique visitors a month.

It sucks, but it had to be done. The last thing I wanted to do was spend time and money maintaining old blog posts that would never drive a dollar in revenue.

I knew that if someone was going to come to my blog to research Vine, there was little to no chance that the person would convert into a 6-figure consulting contract.

After I pruned and cropped the KISSmetrics blog, I naturally followed the same path as Ubersuggest and merged it into this site.

The merge

The KISSmetrics merge was a bit more complicated than Ubersuggest.

With Ubersuggest, I didn’t have a keyword research tool on my site, so all I had to do was slap on a new design, add a feature or two, and port it over.

With KISSmetrics, a lot of the content was similar to what was already on this blog. For the posts that overlapped, I kept my version, considering this blog generates more traffic than the KISSmetrics one.

As for all of the content that was unique and different, I ended up moving it over and applying 301 redirects.

If I had skipped the pruning and cropping stage that I described above, the KISSmetrics blog would have had more traffic, and the merged numbers would have looked even better.

But in marketing you can’t focus on vanity metrics like how many more unique visitors you are getting per month. You need to keep your eye on the prize.

And for me, that’s leads.

The more leads I generate for my ad agency, the more likely I’ll increase my revenue.

Here’s my lead count for the weeks prior to the KISSmetrics merge:

When looking at the table above, keep in mind it shows leads from the U.S. only.

The KISSmetrics blog was merged on the 25th. When you add up all of the numbers from the previous week, there were 469 leads in total, of which 61 were marketing qualified leads.

That means there were 61 leads that the sales reps were able to contact as the vast majority of leads are companies that are too small for us to service.

When you look at the week of the 25th, there were a total of 621 leads, 92 of which were marketing qualified leads.

Just from that one acquisition, I was able to grow my marketing qualified leads by 50.8%. 🙂
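That 50.8% figure checks out: the jump from 61 to 92 marketing qualified leads is (92 - 61) / 61 ≈ 0.508. A quick arithmetic check:

```python
# Verify the reported lift in marketing qualified leads (MQLs).
mql_before = 61  # week before the KISSmetrics merge
mql_after = 92   # week of the merge
lift = (mql_after - mql_before) / mql_before
print(f"{lift:.1%}")  # prints 50.8%
```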

I know what you are thinking though. The week after the 25th (7/2) the leads tanked again. Well, you have to keep in mind that the table only shows leads from the U.S. and during that week there was a national holiday, the 4th of July. So, leads were expected to be low.

But still, even with the holiday, we generated 496 leads, 68 of which were marketing qualified. We still generated more marketing qualified leads than when we didn’t have the KISSmetrics traffic.

The early results show that this is going to work out (or so I hope). If you ever want to consider buying up sites that aren’t generating revenue, you need to know your numbers like the back of your hand.

My sales funnel

Some of you are probably wondering how I promote my agency from this site. As I mentioned earlier, I will share my funnel and stats with you.

The way I monetize the traffic is by collecting leads (and my sales reps turn those leads into customers).

On the homepage, you will see a URL box.

Once you enter a URL, we do a quick analysis (it’s not 100% accurate all of the time).

And then we show you how many technical SEO errors you have and collect your information (this is how you become a lead).

And assuming we think you are a good fit, you see a screen that allows you to schedule a call (less than 18% of the leads see this).

From there, someone on my team will do a discovery call with you.

Assuming things go well, a few of us internally review everything to double-check that we can really help; we then create projections and a presentation before pitching you for your money (in exchange for services, of course).

That’s the funnel on this site in a nutshell… It’s pretty fine-tuned as well.

For example, when someone books a call we send them text reminders using Twilio to show up to the call as we know this increases the odds of you getting on the phone.

We even do subtle things like asking for your “work email” on the lead form. We know that 9 out of 10 leads that give us a Gmail, Hotmail, AOL, or any other non-work email are typically not qualified.

And it doesn’t stop there… there are lead forms all over the site for this same funnel.

If you are reading a blog post like this, you’ll see a bar at the top that looks something like:

Or if you are about to exit, you will see an exit popup that looks like:

You’ll even see a thank you page that promotes my ad agency once you opt-in:

And if I don’t convince you to reach out to us for marketing help right then and there, you’ll also receive an email or two from me about my ad agency.

As you can see, I’ve fine-tuned my site for conversions.

So much so, that every 1,000 unique visitors from the U.S. turns into 4.4 leads. And although that may not seem high, keep in mind that my goal isn’t to get as many leads as possible. I’m optimizing for quality over quantity as I don’t want to waste the time of my sales team.

For example, I had 2 reps that had a closing ratio of 50% last month. That means for every 2 deals they pitched, 1 would sign up for a 6-figure contract, which is an extremely high closing ratio.

Hence, I am trying to focus on quality so everyone in sales can get to 50% as it makes the business more efficient and profitable.

The last thing you want to do is pay a sales rep tons of money to talk to 50 people to only find 1 qualified lead. That hurts both you and your sales reps.


The strategy I am using to buy websites may seem risky, but I know my numbers like the back of my hand. From an outsider’s perspective it may seem crazy, but to me, it is super logical.

And the reason I buy sites for their traffic is that I already have a working business model.

So, buying sites based on their traffic is much cheaper than buying sites for their revenue. In addition to that, my return on investment is much larger.

For example, if I wanted to buy KISSmetrics (the whole business), I would have to spend millions and millions of dollars.

I’m looking for deals; it’s how you grow faster without having to raise venture capital.

When you use this strategy, there is no guarantee you will make a return on your investment, but if you spend time understanding the numbers you can reduce your risk.

I knew going into this KISSmetrics deal that I would generate at least an extra $500,000 in profit from this one acquisition.

Realistically it should be much more than that as the additional leads seem to be of the same quality, and the numbers are penciling out for it to add well into the millions in revenue per year.

But before you pull the trigger and buy up a few sites in your space, there are a few things you need to keep in mind:

Don’t buy sites that rely on 1 traffic source – you don’t want to buy sites that only have Facebook traffic. Or even Google traffic. Ideally, any site you buy should have multiple traffic sources (other than paid ads) as it will reduce your risk in case they lose their traffic from a specific channel.
Buy old sites – sites that are less than 3 years old are risky. Their numbers fluctuate more than older sites.
Spend time understanding the audience – run surveys, dive deep into Google Analytics… do whatever you can to ensure that the site you are buying has an audience that is similar to your current business.
Be patient and look for deals – I hit up hundreds of sites every month. Some people hate my emails and won’t give me the time of day. That’s ok. I’m a big believer and continually pushing forward until I find the right deal. I won’t spend money just because I am getting antsy.
Get creative – a lot of people think their site is worth more than it really is. Try to explain to them what it is really worth using data. I also structure deals in unique ways; for example, I gave KISSmetrics up to 6 months before they had to transition to a new domain (and to some extent they are still allowed to use the existing domain for their client login area). You can even work out payment plans, seller-based financing, or equity deals… you just have to think outside the box.

So, what do you think about my acquisition strategy? Are you going to try it out?

The post Why I Spent $500,000 Buying a Blog That Generates No Revenue appeared first on Neil Patel.

What is AMP Project: A Breakdown

Posted by on Jul 16, 2018 in SEO Articles | Comments Off on What is AMP Project: A Breakdown



AMP stands for Accelerated Mobile Pages. It is a web page building framework that aims to optimize pages for mobile viewing by keeping them lightweight and ensuring a fast load time on the user's end.

The open source AMP Project was announced on October 7, 2015. It was first unveiled to mobile internet users in February 2016 via the Top Stories section in Google Search, and by September of that year users were being served actual AMP pages in regular search results. Backed by Google and many other tech giants, AMP is taking center stage as a standard for web page optimization.

Since its birth, AMP web pages have been heavily featured in mobile Google searches, with the Top Stories carousel highlighting solely AMP pages. This heavy endorsement shows that the search giant prefers the AMP format as the standard mobile format.

The top stories carousel featuring solely AMP pages that are marked with a lightning bolt.

Google has been actively advocating load time as a critical aspect of user experience. The AMP framework addresses what the team deemed to be bloat in web pages by setting up restrictions, and the end result is a web page that is six times lighter than its HTML counterpart. Author-written JavaScript is banned outright, and custom CSS must be inlined and kept small. If you want those things, you need to play it the AMP way.
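For reference, a bare-bones AMP page looks roughly like the sketch below. The mandatory amp-boilerplate style rules are omitted for brevity, and the URLs and titles are placeholders:

```html
<!doctype html>
<html ⚡ lang="en">
<head>
  <meta charset="utf-8">
  <title>Hello AMP</title>
  <!-- example.com is a placeholder; the canonical link points at the regular HTML version. -->
  <link rel="canonical" href="https://example.com/article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
  <!-- The mandatory <style amp-boilerplate> rules go here; omitted for brevity. -->
  <!-- The AMP runtime is the only <script> allowed. -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
</head>
<body>
  <h1>Hello, AMP</h1>
</body>
</html>
```

Everything dynamic (images, ads, analytics) is done through AMP's own components rather than custom scripts, which is how the framework keeps its weight down.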

Now for the user’s end: all AMP pages are labeled with a lightning bolt symbol in mobile searches, and a Top Stories carousel consisting only of AMP pages is presented at the top of the SERP. Users need only scroll from left to right to access a list of articles related to their query. AMP pages listed in the results are also preloaded, creating an experience of instant load time when users tap through. All in all, AMP pages make for a pretty sweet user experience.


The main selling point of a web page built on AMP is how fast and light it is. Google, Twitter, and even eBay advocate AMP because of this. Speed makes for a better user experience, and a better user experience translates to a better conversion rate. To be honest, every business answers to money and revenue, and that’s Google’s way of ensuring publishers get it.

Boasting a median load time of 0.5 seconds from Google Search, AMP does seem to be holding up its end of the lightning-fast bargain. As a rule of thumb, any web page should load completely within 8 seconds, while mobile pages should load in under a second; 53% of mobile users abandon a web page if it takes more than 3 seconds to load.

Users want their pages fast, and AMP is Google’s answer. There are a number of case studies published on the AMP website, and giving them a look, all of them state that after adopting AMP their load times are at least 2 times faster; some websites even sped up by 8 times. That shows how powerful AMP can be in terms of speed.

Pleased with the high load speed, users spend twice as much time on an AMP page. And here comes the most important part: e-commerce sites experienced a 20 percent increase in sales conversions, which proves that a good user experience does translate into a better conversion rate.

How do they force this diet on bloated web pages? They strip them down.

“One thing we realized early on is that many performance issues are caused by the integration of multiple JavaScript libraries, tools, embeds, etc. into a page,” said Malte Ubl, AMP Project Team Lead.


Not everyone is happy with the AMP project and sings its praises like it’s an angel descending, though. To be clear, the harshest backlash is directed at the way the project team handles AMP pages rather than at the actual framework itself.

People argue that AMP is Google’s way of reaching an internet monopoly. In January, A Letter About Google AMP was published, publicly voicing the concerns of internet activists, engineers, corporations, and users alike.

A Letter About Google AMP gained a little shy of 700 signatories.

Let’s have a look at the two main arguments.

1. AMP participants are granted preferential search promotion.

2. Unbeknownst to users, they stay inside the Google ecosystem when they visit a piece of AMP content, regardless of who published it.

There is no denying that AMP pages are, in fact, given preferential treatment in mobile searches, made obvious by the Top Stories carousel of AMP pages that appears at the top of the SERP. Major publishers like CNN that adopted the AMP format even have their own carousel of AMP articles in their search result rich snippet.

One thing to take into consideration: the top results displayed in the SERP remain a mix of AMP and non-AMP pages.

SERP with a mix of AMP and HTML results.

This means the actual search ranking is neither affected nor dominated by the emergence of AMP pages. What has changed is that the traditional top results in the SERP are now disrupted by the highlighted AMP Top Stories carousel.

My take on it? The SERP has not been a list of 10 blue links on a white backdrop for a long, long time now. The SERP nowadays is much more dynamic and personally tailored, and Google, as a company, endorsing and pushing its own product sounds only fitting. However, considering that Google has already dominated search for a decade, it can come across as overkill.

Google argues that the AMP project is not its own; rather, it is the collaborative brainchild of multiple tech giants. For me, the point is that one cannot deny the AMP project has been given the Google prefix more often than not. The AMP project and Google are closely tied; that’s a fact. To what extent? That is the question Google needs to answer.

Next, the second concern stems from the fact that AMP pages are cached on the Google CDN and the cached copy is what gets distributed to users. When a user clicks on a preloaded AMP search result, they are viewing it in the AMP viewer, on the search page itself. Are users aware of this? Are you aware of this? That is the question.

Restlessness arises because people think Google is effectively stripping content away from its creators and publishers. They feel a lack of control over the user experience of their AMP pages, since users are served Google’s copy of them.

What’s more, if you pay attention to the URL of an AMP page, instead of the origin publisher’s URL you will see a Google URL. This further enrages content creators, since from their point of view the users never exited Google to view their creations. Instead, users are shown their web page inside Google, masked by a Google URL.

The AMP project team’s answer to the second concern is an update published in May: publishers can now choose to serve a signed HTTP exchange, which is then distributed by the browser. That way, users are shown the actual publisher’s URL. A new set of tools has also been made available for developers to implement the exchange.

In AMP’s defense, even though it breaks the URL system, the actual attribution is not broken: all the views and revenue of an AMP page go to the publisher instead of Google, contrary to what it is accused of. But breaking the URL system itself is a big no-no, and everyone, the AMP team included, is aware of it.

Will the signed exchange resolve the issue and tension? We’ll have to see.


Advertising is an integral part of the modern internet ecosystem. Try to name a single website without ads; if you do find one, it is because the whole website itself is an advertisement.

People get very creative with ads, because ads bring revenue, revenue means money, and people go absolutely all out when money is on the line.

The problem with a framework as streamlined as AMP is the restrictions. The AMP project sets up a lot of restrictions on what can be written, which in turn restricts how creative the ads can be. No more pop-ups, double pop-ups, or banners on the top, bottom, left, and right. No more waiting 5 seconds to click “ignore this ad” to reach the page you want.

Everything is streamlined: every web page is a block of scrollable text and pictures, no tricks. To keep AMP pages as light as possible, third-party JavaScript and complex HTML are not allowed, which means more advanced functions, including some tracking and analytics, are not happening.

An advertisement featured in a Wired AMP article

Instead, ads are displayed in a block just like any other content. Note that ads load later than the content itself. To address the issue of ads loading noticeably late, the AMP project came up with amphtml ads, which are supported by a handful of ad publishers.

The verdict: advertisements can absolutely be run on AMP pages, though with fewer placement options. By abiding by the AMP rules of HTML subsets, ads can load just as lightning fast, giving a better user experience without sacrificing the money maker.


Ever heard of Snapchat stories? What about Facebook stories? Now the newest stories are served up by AMP.

The AMP project announced the AMP stories feature in February. What they are, basically, is a stack of Snapchat-Explore-esque cards. Users can easily browse these featured AMP stories in search. Think of it as a slide show, but remastered: lightweight and instant. More importantly, it is endorsed by Google and its major publisher partners.

Although viewable on both desktop and mobile, AMP stories only show up in mobile search. What’s more, on further inspection of the publishers’ websites, I can’t find any links to the AMP stories.

AMP stories can be about a wide range of topics, from “The Prince Harry You Don’t Know” to “Walking Through Raqqa’s Rubble.” The sky’s the limit for writers and creators to craft a compelling story: words and pictures strung together in a stack of cards.

The AMP story on Prince Harry by CNN

A spread about the Raqqa Campaign in Syria.

On mobile, while searching for The Washington Post (one of the collaborators on AMP stories), I am greeted with a “top visual stories from” carousel. Each of these eight visual stories features a unique story, and while browsing through them I came across one card in the stack featuring (surprise!) an ad.

Since the AMP stories feature was only introduced three months ago, it is pretty much still in the experimental stage. How ads will function is not yet settled, and we shall anticipate further updates from the AMP project team.

With how little time and effort people are willing to spend on consuming casual information, we can only imagine the format will be a fitting one.


As a Firefox user on Android, I have not noticed an influx of AMP articles. Switching to the Chrome browser, though, I am greeted with a dynamic, AMP-highlighted search page. AMP pages also show up in iOS searches using Safari.

There are no AMP pages on the SERP in Firefox on my Android phone

The AMP project’s official site states support for all major browsers, including Chrome, Firefox, Edge, and Safari. Therefore I am baffled by the total lack of AMP pages in my Firefox search results. Or are they just not labeled as such?

More often than not, users are not aware of AMP. Either they do not notice the lightning bolt symbol and the difference in load speed, or they are aware and simply disregard it.

However, the main purpose of the AMP project is to enhance user experience. The project aims to serve fast, straight-to-the-point content. What’s the point if the people being served don’t even know they are being served?


Being an AMP page is not a ranking factor; responsive web design and fast load times are. You can see a search result filled with AMP pages, HTML pages, or a mix of both, depending on what you’re looking for. The point is, AMP has not been totally dominating the SERP.

However, being the golden child of Google does mean something for SEO. We’ll see why.

Google has been talking about responsive web design and load speed since the “mobilegeddon” in 2015, and not without reason. By 2017, in North America alone, 42.41% of internet traffic came from mobile users, and the numbers are only increasing. Developers and tech companies need to adapt to this rapid change in user preference.

AMP targets the mobile side of responsiveness while ensuring a fast load time. Therefore, having an AMP page basically boosts your chance of ranking higher in mobile searches.

However, you have to keep in mind that, as the name suggests, AMP is only for mobile. What’s more, it does not yet have full support in all web browsers.

Here is where Google comes into the picture. Chrome, the mobile web browser with more than half the market share, is another golden child of Google, which means Chrome fully supports the AMP format.

Chrome is entirely compatible with AMP because they are both products of Google.

With the SERP becoming more and more dynamic, a lot of new elements are being introduced to the search page. From Featured Snippets to People Also Ask, these features often overtake the traditional blue-link results. SEO for web pages should aim for these featured spots too.

Here’s the thing: being featured in the top stories carousel might very well be another strategic placement, like being the featured snippet. AMP pages give SEOs another spot to target beyond the first page of the SERP.

SEOs can take advantage of Google’s efforts to advocate AMP. They have more ways to get spotlighted on the SERP than traditional web pages do.


Unlike making a design responsive, you are creating a whole other page: an alternate version of your web page served only to mobile users on supported browsers.

The working model for the project’s publishing partners is to maintain two pages: one run-of-the-mill HTML page and another in AMP. Take CNN, for example: each of their articles can be browsed in AMP, while on desktop you are served the normal HTML version.

A CNN article as viewed on desktop.

The same article in AMP format.

What this means is that programmers must constantly maintain two copies of every page.
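The two copies are tied together with a pair of link tags so search engines can discover the AMP version and attribute it back to the original. A sketch, with placeholder URLs:

```html
<!-- On the regular (canonical) page: point to the AMP version -->
<link rel="amphtml" href="https://example.com/article/amp/">

<!-- On the AMP page: point back to the canonical version -->
<link rel="canonical" href="https://example.com/article/">
```

Without this pairing, Google treats the AMP document as just another standalone page instead of the fast alternate of the original.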

Implementation can be time-consuming and expensive, especially for existing websites that are complex or huge. Despite the readily available learning resources and templates, creating an AMP copy of an existing site is not instant.

You have to remember that the AMP project is not an out-of-the-box solution. If your website lacks responsiveness or speed, adopting AMP can solve that problem, but adapting to the format itself will be the next problem.

The AMP framework needs customization for all the elements of a traditional website to work. John Mueller from Google has openly said that an AMP page should be as equivalent to the main page as possible. Considering the restrictions on JavaScript and CSS, this means a lot of modification work.

That’s one of the reasons AMP has been viewed as a format that works best for articles. News articles, blog posts, and basic blocks of text work better in AMP because they usually lack interactive elements.


If you’re thinking about AMP-lifying your web pages, here’s one last piece of info that may pique your interest. Google has announced that it is looking to bring the AMP benefits to all web pages. That means, regardless of your web page’s format, you might get featured in the top stories carousel, among others.

Keep in mind, though, that standardizing a feature requires a lot of testing and a lot of time. I will not say, “hey, just give up on AMP, it doesn’t matter anyway.” No, AMP pages still matter in that they are given advantages in mobile searches.

Since I’m not the one deciding how your website should be, I shall leave you with a few questions instead. Hopefully you can make a decision after this.

1. How much time, effort, and money are you willing to put into AMP-lifying your website?

2. How much do you think AMP will benefit your website, relative to that investment (refer to question 1)?

3. Would it be easier to just revamp your code to make it responsive?


Noindex a post in WordPress, the easy way!

Posted by on Jul 16, 2018 in SEO Articles | Comments Off on Noindex a post in WordPress, the easy way!


Some posts and pages should not show up in search results. To make sure they don’t show up, you should tell search engines to exclude them. You do this with a meta robots noindex tag. Setting a page to noindex makes sure search engines never show it in their results. Here, we’ll explain how easy this is in WordPress, if you use Yoast SEO. 


Why keep a post out of the search results?

Why would you NOT want a page to show up in the search results? Well, most sites have pages that shouldn’t show up there. For example, you might not want people to land on the ‘thank you’ page you redirect people to after they’ve contacted you, or your ‘checkout success’ page. Finding those pages in Google is of no use to anyone.

Not sure if you should noindex or nofollow a post? Read Michiel’s post: Which pages should I noindex or nofollow?
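Whether you set it through a plugin or by hand, the end result is a single meta tag in the page’s `<head>`. If you are not using Yoast SEO, you can add something equivalent yourself:

```html
<head>
  <!-- Tells search engines not to show this page in search results -->
  <meta name="robots" content="noindex">
</head>
```

Note the tag only takes effect once a crawler re-fetches the page and sees it, which is why removal from the results is not instant.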

How to set a page to noindex with Yoast SEO

Setting a post or page to noindex is simple when you’re running Yoast SEO. Below your post, in the Yoast SEO meta box, just click on the advanced tab:

On the advanced tab, you’ll see some questions. The first is: “Allow search engines to show this post in search results?” If you select ‘Yes’, your post can show up in Google. If you select ‘No’, you’ll set the post to noindex, which means it won’t show up in the search results.

The default setting of the post – in this case Yes – is the setting you’ve selected for this post type in the Search Appearance tab of Yoast SEO. If you want to prevent complete sections of your site from showing up in Google, you can set that there. This is further explained in Edwin’s post: Show x in search results?.

Please note that if the post you’re setting to noindex is already in the search results, it might take some time for the page to disappear. The search engines will first have to re-index the page to find the noindex tag. And do not noindex posts frivolously: if they were getting traffic before, you’re losing that traffic.

Were you considering using the robots.txt file to keep something out of the search results? Read why you shouldn’t use the robots.txt file for that.

Do links on noindexed pages have value?

When you set a post to noindex, Yoast SEO automatically assumes you want to set it to noindex, follow. This means that search engines will still follow the links on those pages. If you do not want the search engines to follow the links, your answer to the following question should be No:

This will set the meta robots tag to nofollow, which changes the search engines’ behavior: they’ll ignore all the links on the page. Use this with caution, though! In doubt about whether you need it? Just check Michiel’s post right here.
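Again, the difference comes down to one directive in the meta tag in the page source:

```html
<!-- Keep the page out of results AND ignore its links -->
<meta name="robots" content="noindex, nofollow">

<!-- What Yoast SEO assumes when you only noindex: links still followed -->
<meta name="robots" content="noindex, follow">
```

So the cautious default keeps link value flowing through the page even though the page itself stays out of the results.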

Read more: The ultimate guide to the meta robots tag »

The post Noindex a post in WordPress, the easy way! appeared first on Yoast.

Audience expansion and discovery: how to get ahead

Posted by on Jul 16, 2018 in SEO Articles | Comments Off on Audience expansion and discovery: how to get ahead


One of the reasons we love paid search is that it performs, but its intent-driven nature means it’s not the channel for building scale. The way to do that is to get in front of relevant audiences and generate demand for your product or service. This is where channels such as paid social come into play, and one of the best channels for honing in on various audiences is Facebook.

The most obvious ways to get in front of relevant audiences on Facebook are:

Lookalikes – leveraging CRM lists to create audiences that look similar to your customers. Get more advanced by segmenting your customer list into groups of identifiable characteristics (e.g. high lifetime value, high average order value) and target lookalikes of those groups
Use demographic data and interests of your prime customer base, and target people based on what you already know.

If you’re a semi-sophisticated marketer, you’ve already targeted the most obvious audiences. So what’s next? How do you continue to scale and find more audiences? In this article, we discuss some of the ways you can move forward with finding additional, relevant audiences to test to help push performance and scale.

Poach from competitors

You should absolutely be testing and targeting audiences that like your competitors. They are highly relevant, and as a bonus, you may be able to steal market share from your rivals. For good insights, go into interest targeting on Facebook, input your competitor names, and dig in.

Use Audience Insights tools from Facebook and Google

Advertisers can always use more personas, so it’s helpful to figure out characteristics of relevant audiences that may help you recognize new folks to target.

In Facebook’s Audience Insights tool, input your top competitors/brands and take a look at the audience make-up. For example, if you’re a cosmetics store/brand, you could put in audiences that have interest in Sephora and understand various traits such as demographic info and likes/interests. This can help expand on different personas to build and test in Facebook.


Google has a similar insights tool through which you can leverage Google’s data on your converting audiences to understand any additional traits and behaviors you may not have already known. Here is an example:

You can develop personas using the above information and craft additional audiences in Facebook to test. In the above scenario, for example, you may decide to create the audience “Female, age 25-34, Interests: Fashionista, fashion, etc.,” and target this exactly in Facebook (see below).

With the information presented to you by Google Insights on your existing customers/converters, you should be able to develop a variety of personas, then create audiences based on those personas and test them in Facebook. For example, let’s say you’re selling machines that make single servings of popcorn. Your audience is probably full of young, single people who are huge Netflix or sports fans. Popcorn is also gluten-free, so that gives you a huge segment to target if you haven’t already thought of it.

Get creative

It’s important to think about ways you can find new audiences without pulling the obvious levers. For example, if you know that your customers have a high household income, it’s likely you’re already targeting those incomes in Facebook and Google. But what are other ways to reach these people?

Target those who like and purchase more expensive brands. This will open the door to larger audiences (Facebook may not know their household income, but since they purchase high-end products, chances are you are getting in front of relevant eyes). Another example: if you know your customers are ‘fashionistas’, you can target those who like specific fashion bloggers (e.g. interests: Chiara Ferragni, Olivia Palermo).

You should also look at the top-converting placements in your Google Display Network (GDN) campaigns. If you’re spending a significant budget within GDN, this information can be very telling. For example, when running ads on a luxury home furniture site, we discovered that a large chunk of their converters were on celebrity gossip sites. You can take that information and craft an audience to target within Facebook.

Of course, if you have a bigger budget, you can (and should) invest in analytics software and support that pulls third-party information, and information from people visiting your site. But you can get a lot of insight for free – and should be taking advantage of that no matter how refined your paid analytics are.
