SEO Articles

How to get the most out of PageRank and boost rankings

There are 100s of signals that help Google understand and rank content and one signal in particular, PageRank, is often not fully taken advantage of on large scale websites.

Harnessing PageRank can provide the boost you need to gain traction in search results once you’ve covered some of the SEO basics, such as on-page optimisation.

But first, let’s define what PageRank is and how it works.

What is PageRank?

PageRank is a metric used by Google to determine the authority of a page, based on inbound links.

Unlike most other ranking factors, Google used to explicitly give you the PageRank score of a webpage; however, Google decided to retire the public PageRank metric.

Although the public can no longer see PageRank scores, PageRank itself is a signal still used by Google, but many websites don’t efficiently harness its potential to boost rankings.

DYK that after 18 years we’re still using PageRank (and 100s of other signals) in ranking?

Wanna know how it works? https://t.co/CfOlxGauGF

— Gary “鯨理” Illyes (@methode) February 9, 2017

How is PageRank calculated?

The methodology used to calculate PageRank has evolved since Google co-founder Larry Page’s original PageRank patent was first introduced.

“Even when I joined the company in 2000, Google was doing more sophisticated link computation than you would observe from the classic PageRank papers”

Matt Cutts

The original PageRank calculation divided the amount of PageRank a page held equally between its outbound links. For example, if page A has a PageRank of 1 and two outbound links, one to page B and one to page C, then page B and page C each receive 0.5 PageRank.

However, we need to add one more aspect to our basic model. The original PageRank patent also cites what is known as the damping factor, which deducts approximately 15% of PageRank each time a link passes value to another page. The damping factor prevents artificial concentration of rank importance within loops of the web and is still used in PageRank computation today.
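
To make that arithmetic concrete, below is a minimal sketch of the classic PageRank formula with a 0.85 damping factor, run over a tiny made-up three-page link graph. This is the textbook calculation only, not Google’s current, far more sophisticated computation.

```python
# Classic PageRank sketch with a 0.85 damping factor.
# The three-page link graph is invented purely for illustration.
links = {
    "A": ["B", "C"],  # page A links out to pages B and C
    "B": ["C"],
    "C": ["A"],
}

damping = 0.85
pages = list(links)
rank = {page: 1.0 / len(pages) for page in pages}  # start from an even split

for _ in range(20):  # a handful of iterations is enough for this tiny graph
    new_rank = {}
    for page in pages:
        # Each linking page passes an equal share of its rank to every page it links to.
        inbound = sum(
            rank[other] / len(outbound)
            for other, outbound in links.items()
            if page in outbound
        )
        new_rank[page] = (1 - damping) / len(pages) + damping * inbound
    rank = new_rank

print(rank)  # pages with more inbound links end up holding more PageRank
```

The (1 - damping) term is the roughly 15% that is not passed along with each link, which is why long chains of links or redirects leak value.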

PageRank and the reasonable surfer model

The way PageRank is currently calculated is likely far more sophisticated than the original formula. A notable example of this is the reasonable surfer model, which may adjust the amount of PageRank allocated to a link based on the probability that it will be clicked. For instance, a prominent link placed above the fold is more likely to be clicked than a link found at the bottom of a page, and may therefore receive more PageRank.

PageRank simplified

An easy way to understand how PageRank works is to think that every page has a value and the value is split between all the pages it links to.

So, in theory, a page that has attained quality inbound links and is well linked to internally has a much better chance of outranking a page with very few inbound or internal links pointing to it.

How to harness PageRank?
If you don’t want to waste PageRank, don’t link to unimportant pages!

Following on from the previous explanation of PageRank, the first solution to harness PageRank is to simply not link to pages you don’t want to rank, or at the very least reduce the number of internal links that point to unimportant pages. For example, you’ll often see sites that stuff their main navigation with pages that don’t benefit their SEO, or their users.

However, some sites are set up in a way that makes it challenging to harness PageRank. Below are some implementations and tips that can help you get the most out of PageRank in these kinds of situations.

# fragments
What is a # fragment?

A # fragment is often added to the end of a URL to send users to a specific part of a page (called an anchor); it can also be used to control indexing and the distribution of PageRank.

How to use # fragments?

When the goal is to prevent a large number of pages from being indexed while directing and preserving PageRank, # fragments should be added after the most important folder in your URL structure, as illustrated in example A.

We have two pages:

Example A

www.example.com/clothing/shirts#colours=pink,black

URL with a # fragment

Example B

www.example.com/clothing/shirts,colours=pink,black

URL without a # fragment

There is unlikely to be much, if any, specific search demand for a combination of pink and black shirts that warrants a standalone page. Indexing these types of pages will dilute your PageRank and potentially cause indexing bloat, where similar variations of a page compete against each other in search results and reduce the overall quality of your site. So you’ll be better off consolidating and directing PageRank to the main /shirts page.

Google will consider anything that’s placed after a # fragment in a URL to be part of the same document, so www.example.com/clothing/shirts#colours=pink,black should return www.example.com/clothing/shirts in search results. It’s a form of canonicalisation.
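
You can see why this folding works by looking at how URLs are handled: everything after the # is a fragment that the browser keeps client-side and never even sends to the server. A quick illustration using Python’s standard library and the hypothetical shirt URLs above:

```python
# The fragment stays client-side, so both variations resolve to the same document.
from urllib.parse import urldefrag, urlsplit

url = "https://www.example.com/clothing/shirts#colours=pink,black"
base, fragment = urldefrag(url)
print(base)      # https://www.example.com/clothing/shirts  <- the indexable document
print(fragment)  # colours=pink,black                       <- handled client-side only

# By contrast, the comma/parameter version is a distinct URL the server must answer:
print(urlsplit("https://www.example.com/clothing/shirts,colours=pink,black").path)
# /clothing/shirts,colours=pink,black
```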

if page.php#a loads different content than page.php#b , then we generally won’t be able to index that separately. Use “?” or “/”

— John (@JohnMu) February 22, 2017

We fold #a & #b together, for indexing too, so all signals come together.

— John (@JohnMu) April 5, 2017

Pros:

# fragment URLs should consolidate PageRank to the desired page and prevent pages you don’t want to rank from appearing in search results.

Crawl resource should be focused on pages you want to rank.

Cons:

Adding # fragments can be challenging for most frameworks.

Using # fragments can be a great way to concentrate PageRank on the pages you want to rank and prevent unwanted pages from being indexed, meaning # fragment implementation is particularly advantageous for faceted navigation.

Canonicalisation
What is canonicalisation?

rel=”canonical”  ‘suggests’ a preferred version of a page and can be added as an HTML tag or as an HTTP header. rel=”canonical” is often used to consolidate PageRank and prevent low-quality pages from being indexed.

How to use canonicalisation?

Going back to our shirt example…

We have two pages:

Example A

www.example.com/clothing/shirts/

Category shirt page

Example B

www.example.com/clothing/shirts,colours=pink,black

Category shirt page with selected colours

Page B-type pages often come about as a result of faceted navigation. By making the rel=”canonical” URL on page B mirror the rel=”canonical” URL on page A, you are signalling to search engines that page A is the preferred version and that any ranking signals, including PageRank, should be transferred to page A.
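
If you want to verify what a faceted page currently declares, here is a small sketch that reports the canonical from both the HTML tag and the Link HTTP header. It assumes the third-party requests and beautifulsoup4 packages are installed and uses the hypothetical shirt URLs above.

```python
# Sketch: report the canonical a page declares via the HTML tag and the HTTP header.
# Assumes `pip install requests beautifulsoup4`; the URL is the hypothetical page B.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/clothing/shirts,colours=pink,black"
response = requests.get(url, timeout=10)

# 1) rel="canonical" as a <link> tag in the HTML <head>
tag = BeautifulSoup(response.text, "html.parser").find("link", rel="canonical")
print("HTML canonical:  ", tag["href"] if tag else None)

# 2) rel="canonical" as an HTTP Link header (handy for PDFs and other non-HTML files)
print("Header canonical:", response.links.get("canonical", {}).get("url"))

# For a well-configured page B, both should point at the preferred page A:
# https://www.example.com/clothing/shirts/
```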

However, there are disadvantages with a canonicalisation approach as discussed below.

Pros:

Can transfer PageRank to pages you want to rank.

Can prevent duplicate/low-quality pages from being indexed.

Cons:

A canonical tag is a suggestive signal to search engines, not a directive, so they can choose to ignore your canonicalisation hints. Google’s webmaster blog offers guidance on helping Google respect them.

Google has suggested that canonicals are treated like 301 redirects; combined with the damping factor in the original PageRank patent, this implies that not all PageRank will pass to the specified canonical URL.

Even though canonicalised pages are crawled less frequently than indexed pages, they still get crawled. In certain situations, such as large-scale faceted navigation, the sheer number of overly dynamic URLs can eat into your website’s crawl budget, which can have an indirect impact on your site’s visibility.

Overall, if choosing a canonicalisation approach, be confident that Google will respect your canonicalisation suggestions and that you’ve considered the potential impact on your site’s crawl budget if you have a large number of pages to canonicalise.

Make sure internal links return a 200 response code

Arguably one of the quickest wins in preserving PageRank is to update all internal links on a website so that they return a 200 response code.

We know from the original PageRank patent that each link has a damping factor of approx. 15%. So in cases where sites have a large number of internal links that return response codes other than 200, such as 3xx, updating them will reclaim PageRank.

Consider a chain of 301 redirects: each 301 redirect results in a PageRank loss of roughly 15%. Now imagine the amplified loss in PageRank if there were hundreds, or thousands, of these redirects across a site.

This is an extremely common issue for, though not exclusive to, sites that have undergone a migration. The exception to the rule of losing roughly 15% of PageRank through a 301 redirect is when a site migrates from HTTP to HTTPS. Google has been strongly encouraging sites to migrate to HTTPS for a while now and, as an extra incentive to encourage more HTTPS migrations, 3xx redirects from HTTP to HTTPS URLs will not cause PageRank to be lost.

Download Screaming Frog for free and check out their guide to identifying internal links that return a non-200 response code.
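
If you’d rather script a quick spot check yourself, here is a rough sketch along the same lines. It assumes the third-party requests and beautifulsoup4 packages and a hypothetical start URL; it collects the internal links from one page and flags any that don’t return a 200.

```python
# Rough sketch: flag internal links on a page that don't return a 200.
# Assumes `pip install requests beautifulsoup4`; the start URL is hypothetical.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlsplit

start = "https://www.example.com/"
domain = urlsplit(start).netloc

html = requests.get(start, timeout=10).text
links = {
    urljoin(start, a["href"])
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True)
}

for link in sorted(links):
    if urlsplit(link).netloc != domain:
        continue  # only internal links matter here
    # allow_redirects=False so a 301/302 is reported as itself, not as its destination
    status = requests.head(link, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(status, link)
```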

Reclaiming PageRank from 404 pages

Gaining inbound links is the foundation for increasing the amount of PageRank that can be dispersed across your site, so it really hurts when pages that have inbound links pointing to them return a 404 error. Pages that return a 404 error no longer exist and therefore can’t pass on PageRank.

Use tools such as Moz’s Link Explorer to identify 404 pages that have accumulated inbound links and 301 redirect them to an equivalent page to reclaim some of the PageRank.

However, if there is no appropriate equivalent page to redirect a URL to, avoid redirecting it to your homepage. Redirecting unrelated pages to your homepage will likely result in little, if any, PageRank being reclaimed, due to the difference between the original content and your homepage.
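
As a rough sketch of that workflow: assuming you’ve exported the linked-to URLs into a hypothetical backlinked_urls.txt file (one URL per line) and keep your own mapping of retired paths to their closest live equivalents, you could flag the 404s and draft 301 rules along these lines.

```python
# Sketch: find backlinked URLs that now 404 and suggest 301 targets.
# Assumes `pip install requests`, a hypothetical backlinked_urls.txt export,
# and a hand-maintained mapping of retired paths to their closest equivalents.
import requests
from urllib.parse import urlsplit

equivalents = {
    "/old-shirt-guide/": "/clothing/shirts/",  # hypothetical old-to-new mapping
}

with open("backlinked_urls.txt") as handle:
    for url in (line.strip() for line in handle if line.strip()):
        if requests.head(url, timeout=10).status_code != 404:
            continue
        target = equivalents.get(urlsplit(url).path)
        if target:
            print(f"301 {url} -> https://www.example.com{target}")
        else:
            print(f"No close equivalent for {url} - avoid defaulting to the homepage")
```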

Things to avoid
rel=”nofollow”

The use of rel=”nofollow” is synonymous with an old-school SEO tactic whereby SEOs tried to ‘sculpt’ the flow of PageRank by adding rel=”nofollow” to internal links that were deemed unimportant. The goal was to strategically manage how PageRank was distributed throughout a website.

The rel=”nofollow” attribute was originally introduced by Google to fight comment link spam, where people would try to boost their backlink profile by inserting links into comment sections on blogs, articles or forum posts.

This tactic has been redundant for many years now, as Google changed how rel=”nofollow” works. Now, the PageRank sent out with every link is divided by the total number of links on a page, rather than the number of followed links.

However, specifically adding rel=”nofollow” to a link means the share of PageRank assigned to that link will not benefit the destination page, resulting in PageRank attrition. The same attrition also applies to URLs you’ve disallowed in your robots.txt file, or to pages that have had a noindex tag in place for a while.
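
A quick back-of-the-envelope illustration of that change, using simplified numbers and ignoring the damping factor: a page holding 1 unit of PageRank with 10 outbound links, 5 of them nofollowed.

```python
# Simplified illustration of the nofollow change (damping factor ignored).
page_rank = 1.0
total_links = 10
nofollowed = 5
followed = total_links - nofollowed

# Old behaviour: PageRank was split between followed links only, so nofollowing
# some links 'sculpted' a bigger share to the ones that remained.
old_share_per_followed_link = page_rank / followed      # 0.2 each

# Current behaviour: PageRank is divided by the total number of links, and the
# share assigned to nofollowed links simply evaporates.
new_share_per_followed_link = page_rank / total_links   # 0.1 each
evaporated = nofollowed * (page_rank / total_links)     # 0.5 lost entirely

print(old_share_per_followed_link, new_share_per_followed_link, evaporated)
```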

Embedded JavaScript links

Content that’s reliant on JavaScript has been shown to negatively impact organic search performance. Rendering, understanding and then evaluating client-side rendered content is an inefficient process for Google, and some in the SEO industry believe it’s possible to sculpt PageRank by turning unimportant internal links into embedded JavaScript links.

I decided to ask John Mueller whether JavaScript embedded links receive less PageRank than links found in the HTML and he responded unequivocally.

yes

— John (@JohnMu) October 18, 2018

JavaScript SEO is a complex and constantly evolving topic, and Google is getting better at understanding and processing JavaScript.

However, if you’re going to use a JavaScript framework, make sure that Google is able to fully render your content.

Conclusion

PageRank is still an influential ranking signal, and preserving, directing and ultimately harnessing it should be a part of any plan to boost your organic search visibility. Every website’s situation is unique, and a one-size-fits-all approach will not always apply, but hopefully this blog highlights some potential quick wins and tactics to avoid. Let me know in the comments section if I’ve missed anything out.

Read More

Ecommerce SEO Guide: SEO Best Practices for Ecommerce Websites

If you want to get more traffic and sales to your ecommerce website, then on-page SEO is a critical first step. There’s a multitude of how-to articles and tutorials on the web offering general SEO advice, but far fewer that specifically address the needs of ecommerce entrepreneurs. Today, we’d like to give you a basic understanding of on-site search engine optimization for ecommerce. It will be enough to get you started, make sure you’re sending all the right signals to Google, and set you up for SEO success. Let’s dive in. What is Ecommerce SEO? Definition Ecommerce SEO is the…

The post Ecommerce SEO Guide: SEO Best Practices for Ecommerce Websites appeared first on The Daily Egg.

Read More

Why Local Businesses Will Need Websites More than Ever in 2019

Posted by MiriamEllis

64% of 1,411 surveyed local business marketers agree that Google is becoming the new “homepage” for local businesses. Via Moz State of Local SEO Industry Report

…but please don’t come away with the wrong storyline from this statistic.

As local brands and their marketers watch Google play Trojan horse, shifting from top benefactor to top competitor by replacing former “free” publicity with paid packs, Local Service Ads, zero-click SERPs, and related structures, it’s no surprise to see forum members asking, “Do I even need a website anymore?”

Our answer to this question is, “Yes, you’ve never needed a website more than you will in 2019.” In this post, we’ll examine:

Why it looks like local businesses don’t need websites
Statistical proofs of why local businesses need websites now more than ever
The current status of local business websites and most-needed improvements
How Google stopped bearing so many gifts

Within recent memory, a Google query with local intent brought up a big pack of ten nearby businesses, with each entry taking the user directly to these brands’ websites for all of their next steps. A modest amount of marketing effort was rewarded with a shower of Google gifts in the form of rankings, traffic, and conversions.

Then these generous SERPs shrank to seven spots, and then three, with the mobile sea change thrown into the bargain and consisting of layers and layers of Google-owned interfaces instead of direct-to-website links. In 2018, when we rustle through the wrapping paper, the presents we find from Google look cheaper, smaller, and less magnificent.

Consider these five key developments:

1) Zero-click mobile SERPs

This slide from a recent presentation by Rand Fishkin encapsulates his findings regarding the growth of no-click SERPs between 2016–2018. Mobile users have experienced a 20% increase in delivery of search engine results that don’t require them to go any deeper than Google’s own interface.

2) The encroachment of paid ads into local packs

When Dr. Peter J. Myers surveyed 11,000 SERPs in 2018, he found that 35% of competitive local packs feature ads.

3) Google becoming a lead gen agency

At last count, Google’s Local Service Ads program, via which they interpose themselves as the paid lead gen agent between businesses and consumers, has taken over 23 business categories in 77 US cities.

4) Even your branded SERPs don’t belong to you

When a user specifically searches for your brand and your Google Knowledge Panel pops up, you can likely cope with the long-standing “People Also Search For” set of competitors at the bottom of it. But that’s not the same as Google allowing Groupon to advertise at the top of your KP, or putting lead gen from Doordash and GrubHub front and center to nickel and dime you on your own customers’ orders.

5) Google is being called the new “homepage” for local businesses

As highlighted at the beginning of this post, 64% of marketers agree that Google is becoming the new “homepage” for local businesses. This concept, coined by Mike Blumenthal, signifies that a user looking at a Google Knowledge Panel can get basic business info, make a phone call, get directions, book something, ask a question, take a virtual tour, read microblog posts, see hours of operation, thumb through photos, see busy times, read and leave reviews. Without ever having to click through to a brand’s domain, the user may be fully satisfied.

“Nothing is enough for the man to whom enough is too little.”
– Epicurus

There are many more examples we could gather, but they can all be summed up in one way: None of Google’s most recent local initiatives are about driving customers to brands’ own websites. Local SERPs have shrunk and have been re-engineered to keep users within Google’s platforms to generate maximum revenue for Google and their partners.

You may be as philosophical as Epicurus about this and say that Google has every right to be as profitable as they can with their own product, even if they don’t really need to siphon more revenue off local businesses. But if Google’s recent trajectory causes your brand or agency to conclude that websites have become obsolete in this heavily controlled environment, please keep reading.

Your website is your bedrock

“65% of 1,411 surveyed marketers observe strong correlation between organic and local rank.” – Via Moz State of Local SEO Industry Report

What this means is that businesses which rank highly organically are very likely to have high associated local pack rankings. In the following screenshot, if you take away the directory-type platforms, you will see how the brand websites ranking on page 1 for “deli athens ga” are also the two businesses that have made it into Google’s local pack:

How often do the top 3 Google local pack results also have 1st-page organic rankings?

In a small study, we looked at 15 head keywords across 7 US cities and towns. This yielded 315 possible entries in Google’s local pack. Of that 315, 235 of the businesses ranking in the local packs also had page 1 organic rankings. That’s a 75% correlation between organic website rankings and local pack presence.

*It’s worth noting that where local and organic results did not correlate, it was sometimes due to the presence of spam GMB listings, or to mystery SERPs that did not make sense at first glance — perhaps as a result of Google testing, in some cases.

Additionally, many local businesses are not making it to the first page of Google anymore in some categories because the organic SERPs are inundated with best-of lists and directories. Often, local business websites were pushed down to the second page of the organic results. In other words, if spam, “best-ofs,” and mysteries were removed, the local-organic correlation would likely be much higher than 75%.

Further, one recent study found that even when Google’s Local Service Ads are present, 43.9% of clicks went to the organic SERPs. Obviously, if you can make it to the top of the organic SERPs, this puts you in very good CTR shape from a purely organic standpoint.

Your takeaway from this

The local businesses you market may not be able to stave off the onslaught of Google’s zero-click SERPs, paid SERPs, and lead gen features, but where “free” local 3-packs still exist, your very best bet for being included in them is to have the strongest possible website. Moreover, organic SERPs remain a substantial source of clicks.

Far from it being the case that websites have become obsolete, they are the firmest bedrock for maintaining free local SERP visibility amidst an increasing scarcity of opportunities.

This calls for an industry-wide doubling down on organic metrics that matter most.

Bridging the local-organic gap

“We are what we repeatedly do. Excellence, then, is not an act, but a habit.”
– Aristotle

A 2017 CNBC survey found that 45% of small businesses have no website, and, while most large enterprises have websites, many local businesses qualify as “small.”

Moreover, a recent audit of 9,392 Google My Business listings found that 27% have no website link.

When asked which one task 1,411 marketers want clients to devote more resources to, it’s no coincidence that 66% listed a website-oriented asset. This includes local content development, on-site optimization, local link building, technical analysis of rankings/traffic/conversions, and website design as shown in the following Moz survey graphic:

In an environment in which websites are table stakes for competitive local pack rankings, virtually all local businesses not only need one, but they need it to be as strong as possible so that it achieves maximum organic rankings.

What makes a website strong?

The Moz Beginner’s Guide to SEO offers incredibly detailed guidelines for creating the best possible website. While we recommend that everyone marketing a local business read through this in-depth guide, we can sum up its contents here by stating that strong websites combine:

Technical basics
Excellent usability
On-site optimization
Relevant content publication
Publicity

For our present purpose, let’s take a special look at those last three elements.

On-site optimization and relevant content publication

There was a time when on-site SEO and content development were treated almost independently of one another. And while local businesses will need to make a little extra effort to put their basic contact information in prominent places on their websites (such as the footer and Contact Us page), publication and optimization should be viewed as a single topic. A modern strategy takes all of the following into account:

Keyword and real-world research tell a local business what consumers want
These consumer desires are then reflected in what the business publishes on its website, including its homepage, location landing pages, about page, blog and other components
Full reflection of consumer desires includes ensuring that human language (discovered via keyword and real-world research) is implemented in all elements of each page, including its tags, headings, descriptions, text, and in some cases, markup

What we’re describing here isn’t a set of disconnected efforts. It’s a single effort that’s integral to researching, writing, and publishing the website. Far from stuffing keywords into a tag or a page’s content, focus has shifted to building topical authority in the eyes of search engines like Google by building an authoritative resource for a particular consumer demographic. The more closely a business is able to reflect customers’ needs (including the language of their needs), in every possible component of its website, the more relevant it becomes.

A hypothetical example of this would be a large medical clinic in Dallas. Last year, their phone staff was inundated with basic questions about flu shots, like where and when to get them, what they cost, would they cause side effects, what about side effects on people with pre-existing health conditions, etc. This year, the medical center’s marketing team took a look at Moz Keyword Explorer and saw that there’s an enormous volume of questions surrounding flu shots:

This tiny segment of the findings of the free keyword research tool, Answer the Public, further illustrates how many questions people have about flu shots:

The medical clinic need not compete nationally for these topics, but at a local level, a page on the website can answer nearly every question a nearby patient could have about this subject. The page, created properly, will reflect human language in its tags, headings, descriptions, text, and markup. It will tell all patients where to come and when to come for this procedure. It has the potential to cut down on time-consuming phone calls.

And, finally, it will build topical authority in the eyes of Google to strengthen the clinic’s chances of ranking well organically… which can then translate to improved local rankings.

It’s important to note that keyword research tools typically do not reflect location very accurately, so research is typically done at a national level, and then adjusted to reflect regional or local language differences and geographic terms, after the fact. In other words, a keyword tool may not accurately reflect exactly how many local consumers in Dallas are asking “Where do I get a flu shot?”, but keyword and real-world research signals that this type of question is definitely being asked. The local business website can reflect this question while also adding in the necessary geographic terms.

Local link building must be brought to the fore of publicity efforts

Moz’s industry survey found that more than one-third of respondents had no local link building strategy in place. Meanwhile, link building was listed as one of the top three tasks to which marketers want their clients to devote more resources. There’s clearly a disconnect going on here. Given the fundamental role links play in building Domain Authority, organic rankings, and subsequent local rankings, building strong websites means bridging this gap.

First, it might help to examine old prejudices that could cause local business marketers and their clients to feel dubious about link building. These most likely stem from link spam which has gotten so out of hand in the general world of SEO that Google has had to penalize it and filter it to the best of their ability.

Not long ago, many digital-only businesses were having a heyday with paid links, link farms, reciprocal links, abusive link anchor text and the like. An online company might accrue thousands of links from completely irrelevant sources, all in hopes of escalating rank. Clearly, these practices aren’t ones an ethical business can feel good about investing in, but they do serve as an interesting object lesson, especially when a local marketer can point out to a client that the best local links typically result from real-world relationship-building.

Local businesses are truly special because they serve a distinct, physical community made up of their own neighbors. The more involved a local business is in its own community, the more naturally link opportunities arise from things like local:

Sponsorships
Event participation and hosting
Online news
Blogs
Business associations
B2B cross-promotions

There are so many ways a local business can build genuine topical and domain authority in a given community by dint of the relationships it develops with neighbors.

An excellent way to get started on this effort is to look at high-ranking local businesses in the same or similar business categories to discover what work they’ve put in to achieve a supportive backlink profile. Moz Link Intersect is an extremely actionable resource for this, enabling a business to input its top competitors to find who is linking to them.

In the following example, a small B&B in Albuquerque looks up two luxurious Tribal resorts in its city:

Link Intersect then lists out a blueprint of opportunities, showing which links one or both competitors have earned. Drilling down, the B&B finds that Marriott.com is linking to both Tribal resorts on an Albuquerque things-to-do page:

The small B&B can then try to earn a spot on that same page, because it hosts lavish tea parties as a thing-to-do. Outreach could depend on the B&B owner knowing someone who works at the local Marriott personally. It could include meeting with them in person, or on the phone, or even via email. If this outreach succeeds, an excellent, relevant link will have been earned to boost organic rank, underpinning local rank.

Then, repeat the process. Aristotle might well have been speaking of link building when he said we are what we repeatedly do and that excellence is a habit. Good marketers can teach customers to have excellent habits in recognizing a good link opportunity when they see it.

Taken altogether

Without a website, a local business lacks the brand-controlled publishing and link-earning platform that so strongly influences organic rankings. In the absence of this, the chances of ranking well in competitive local packs will be significantly less. Taken altogether, the case is clear for local businesses investing substantially in their websites.

Acting now is actually a strategy for the future

“There is nothing permanent except change.”
– Heraclitus

You’ve now determined that strong websites are fundamental to local rankings in competitive markets. You’ve absorbed numerous reasons to encourage local businesses you market to prioritize care of their domains. But there’s one more thing you’ll need to be able to convey, and that’s a sense of urgency.

Right now, every single customer you can still earn from a free local pack listing is immensely valuable for the future.

This isn’t a customer you’ve had to pay Google for, as you very well might six months, a year, or five years from now. Yes, you’ve had to invest plenty in developing the strong website that contributed to the high local ranking, but you haven’t paid a penny directly to Google for this particular lead. Soon, you may be having to fork over commissions to Google for a large portion of your new customers, so acting now is like insurance against future spend.

For this to work out properly, local businesses must take the leads Google is sending them right now for free and convert them into long-term, loyal customers, with an ultimate value of multiple future transactions without Google as the middleman. And if these freely won customers can be inspired to act as word-of-mouth advocates for your brand, you will have done something substantial to develop a stream of non-Google-dependent revenue.

This offer may well expire as time goes by. When it comes to the capricious local SERPs, marketers resemble the Greek philosophers who knew that change is the only constant. The Trojan horse has rolled into every US city, and it’s a gift with a questionable shelf life. We can’t predict if or when free packs might become obsolete, but we share your concerns about the way the wind is blowing.

What we can see clearly right now is that websites will be anything but obsolete in 2019. Rather, they are the building blocks of local rankings, precious free leads, and loyal revenue, regardless of how SERPs may alter in future.

For more insights into where local businesses should focus in 2019, be sure to explore the Moz State of Local SEO industry report:

Read the State of Local SEO industry report

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Read More

An introduction to HTTP/2 for SEOs


In the mid 90s there was a famous incident where an email administrator at a US University fielded a phone call from a professor who was complaining his department could only send emails 500 miles. The professor explained that whenever they tried to email anyone farther away their emails failed — it sounded like nonsense, but it turned out to actually be happening. To understand why, you need to realise that the speed of light actually has more impact on how the internet works than you may think. In the email case, the timeout for connections was set to about 6 milliseconds – if you do the maths that is about the time it takes for light to travel 500 miles.

We’ll be talking about trucks a lot in this blog post!

The time that it takes for a network connection to open across a distance is called latency, and it turns out that latency has a lot to answer for. Latency is one of the main issues that affects the speed of the web, and was one of the primary drivers for why Google started inventing HTTP/2 (it was originally called SPDY when they were working on it, before it became a web standard).

HTTP/2 is now an established standard and is seeing a lot of use across the web, but it is still not as widespread as it could be across most sites. It is an easy opportunity to improve the speed of your website, but it can be fairly intimidating to try to understand.

In this post I hope to provide an accessible top-level introduction to HTTP/2, specifically targeted towards SEOs. I do brush over some parts of the technical details and don’t cover all the features of HTTP/2, but my aim here isn’t to give you an exhaustive understanding, but instead to help you understand the important parts in the most accessible way possible.

HTTP 1.1 – The Current Norm

Currently, when you request a web page or another resource (such as images, scripts, CSS files etc.), your browser speaks HTTP to a server in order to communicate. The current version is HTTP/1.1, which has been the standard for the last 20 years, with no changes.

Anatomy of a Request

We are not going to drown in the deep technical details of HTTP too much in this post, but we are going to quickly touch on what a request looks like. There are a few bits to a request:

The top line here is saying what sort of request this is (GET is the normal sort of request, POST is the other main one people know of), and what URL the request is for (in this case /anchorman/) and finally which version of HTTP we are using.

The second line is the mandatory ‘host’ header, which is a part of all HTTP/1.1 requests; it covers the situation where a single webserver may be hosting multiple websites and needs to know which one you are looking for.

Finally, there will be a variety of other headers, which we are not going to get into. In this case I’ve shown the User-Agent header, which indicates which sort of device and software (browser) you are using to connect to the website.
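
If you want to see those pieces for yourself, you can send much the same request with Python’s standard http.client module and watch the request line and headers go out. The hostname, the /anchorman/ path and the User-Agent string are just illustrative values.

```python
# Sketch: build the request described above and watch it go over the wire.
# www.example.com, /anchorman/ and the User-Agent value are illustrative only.
from http.client import HTTPConnection

conn = HTTPConnection("www.example.com", 80)
conn.set_debuglevel(1)  # prints the raw request line and headers as they are sent
conn.request(
    "GET",              # the sort of request (the method)
    "/anchorman/",      # the URL being asked for
    headers={
        "Host": "www.example.com",               # the mandatory HTTP/1.1 host header
        "User-Agent": "curious-seo-script/0.1",  # identifies the requesting software
    },
)
response = conn.getresponse()
print(response.status, response.version)  # status code, and 11 meaning HTTP/1.1
conn.close()
```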

HTTP = Trucks!

In order to help explain and understand HTTP and some of the issues, I’m going to draw an analogy between HTTP and … trucks! We are going to imagine that an HTTP request being sent from your browser is a truck that has to drive from your browser over to the server:

A truck represents an HTTP request/response to a server

In this analogy, we can imagine that the road itself is the network connection (TCP/IP, if you want) from your computer to the server:

The road is a network connection – the transport layer for our HTTP Trucks

Then a request is represented by a truck, that is carrying a request in it:

HTTP Trucks carry a request from the browser to the server

The response is the truck coming back with a response, which in this case is our HTML:

HTTP Trucks carry a response back from the server to the browser

“So what is the problem?! This all sounds great, Tom!” – I can hear you all saying. The problem is that in this model, anyone can stare down into the truck trailers and see what they are hauling. Should an HTTP request contain credit card details, personal emails, or anything else sensitive, anybody can see your information.

HTTP Trucks aren’t secure – people can peek at them and see what they are carrying

HTTPS

HTTPS was designed to combat the issue of people being able to peek into our trucks and see what they are carrying.

Importantly, HTTPS is essentially identical to HTTP – the trucks and the requests/responses they transport are the same as they were. The response codes and headers are all the same.

The difference all happens at the transport (network) layer; we can imagine it as a secure tunnel over our road:

In HTTPS, requests & responses are the same as HTTP. The road is secured.

In the rest of the article, I’ll imagine we have a tunnel over our road, but won’t show it – it would be boring if we couldn’t see our trucks!

Impact of Latency

So the main problem with this model is related to the top speed of our trucks. In the 500-mile email introductory story we saw that the speed of light can have a very real impact on the workings of the internet.

HTTP Trucks cannot go faster than the speed of light.

HTTP requests and many HTTP responses tend to be quite small. However, our trucks can only travel at the speed of light, and so even these small requests can take time to go back and forth from the user to the website. It is tempting to think this won’t have a noticeable impact on website performance, but it is actually a real problem…

HTTP Trucks travel at a constant speed, so longer roads mean slower responses.

The farther the distance of the network connection between a user’s browser and the web server (the length of our ‘road’) the farther the request and response have to travel, which means they take longer.

Now consider that a typical website is not a single request and response, but is instead a sequence of many requests and responses. Often a response will mean more requests are required – for example, an HTML file probably references images, CSS files and JavaScript files:

Some of these files then may have further dependencies, and so on. Typically websites may be 50-100 separate requests:

Web pages nowadays often require 50-100 separate HTTP requests.

Let’s look at how that may look for our trucks…

Send a request for a web page:

We send a request to the web server for a page.

Request travels to server:

The truck (request) may take 50ms to drive to the server.

Response travels back to browser:

And then 50ms to drive back with the response (ignoring time to compile the response!).

The browser parses the HTML response and realises there are a number of other files that are needed from the server:

After parsing the HTML, the browser identifies more assets to fetch. More requests to send!

Limit of HTTP/1.1

The problem we now encounter is that there are several more files we need to fetch, but with an HTTP/1.1 connection each road can only handle a single truck at a time. Every HTTP request needs its own TCP (networking) connection, and each truck can only carry one request at a time.

Each truck (request) needs its own road (network connection).

Furthermore, building a new road, or opening a new networking connection, also requires a round trip. In our world of trucks we can liken this to needing a steamroller to first lay the road and then add our road markings. This is another whole round trip, which adds more latency:

New roads (network connections) require work to open them.

This means another whole round trip to open new connections.

Typically browsers open around 6 simultaneous connections at once:

Browsers usually open 6 roads (network connections).

However, if we are looking at 50-100 files needed for a webpage we still end up in the situation where trucks (requests) have to wait their turn. This is called ‘head of line blocking’:

Often trucks (requests) have to wait for a free road (network connection).

If we look at the waterfall diagram for a simple page that has a CSS file and a lot of images (this example comes from an HTTP/2 demo site), you can see this in action:

Waterfall diagrams highlight the impact of round trips and latency.

In the diagram above, the orange and purple segments can be thought of as our steamrollers, where new connections are made. You can see that initially there is just one connection open (line 1), with another connection being opened. Line 2 then re-uses the first connection, and line 3 is the first request over the second connection. When those complete, lines 4 & 5 are the next two images.

At this point the browser realises it will need more connections so four more are opened and then we can see requests are going in batches of 6 at a time corresponding with the 6 roads or network connections that are open.

Latency vs Bandwidth

In the waterfall diagram above, each of these images may be small, but each requires a truck to come and fetch it. This means lots of round trips, and given we can only run 6 at a time, there is a lot of time spent with requests waiting.
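
To put rough numbers on that queueing effect, here is a deliberately simplified model that ignores bandwidth, server processing time and connection setup, and just counts round trips. The figures are illustrative, not measured.

```python
# Back-of-the-envelope model: latency dominates when requests queue behind a
# limited number of connections. All numbers are illustrative.
import math

round_trip_seconds = 0.1   # 50 ms each way, as in the truck example above
total_requests = 100
parallel_connections = 6   # the usual per-host browser limit over HTTP/1.1

# HTTP/1.1-ish: requests travel in batches of 6, each batch costing one round trip.
http1_time = math.ceil(total_requests / parallel_connections) * round_trip_seconds

# HTTP/2-ish (idealised): all requests share one multiplexed connection,
# so the whole lot costs roughly one round trip.
http2_time = round_trip_seconds

print(f"HTTP/1.1 ~ {http1_time:.1f}s, HTTP/2 ~ {http2_time:.1f}s")
# Doubling bandwidth changes neither number in this model; halving latency halves both.
```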

It is sometimes difficult to understand the difference between bandwidth and latency. Bandwidth can be thought of as the load capacity of our trucks: a bigger truck can carry more. This often doesn’t help with page load times, though, because each request and response cannot share a truck with another request. This is why increasing bandwidth has been shown to have a limited impact on the load time of pages, as demonstrated in research conducted by Mike Belshe at Google and discussed in this article from Googler Ilya Grigorik:

The reality was clear that in order to improve the performance of the web, the issue of latency would need to be addressed. The research above was what led to Google developing the SPDY protocol which later turned into HTTP/2.

Improving the impact of latency

In order to improve the impact that latency has on website load times, there are various strategies that have been employed. One of these is ‘sprite maps’ which take lots of small images and jam them together into single files:

Sprite maps are a trick used to reduce the number of trucks (requests) needed.

The advantage of sprite maps is that they can all be put into one truck (request/response) as they are just a single file. Then clever use of CSS can display just the portion of the image that corresponds to the desired image. One file means only a single request and response are required to fetch them, which reduces the number of round trips required.

Another thing that helps to reduce latency is using a CDN platform, such as CloudFlare or Fastly, to host your static assets (images, CSS files etc. – things that are not dynamic and the same for every visitor) on servers all around the world. This means that the round trips for users can be along a much shorter road (network connection) because there will be a nearby server that can provide them with what they need.

CDNs have servers all around the world, which can make the required roads (network connections) shorter.

CDNs also provide a variety of other benefits, but latency reduction is a headline feature.

HTTP/2 – The New World

So hopefully, you have now realised that HTTP/2 can help reduce latency and dramatically improve the performance of pages. How does it go about it?

Introducing Multiplexing – More trucks to the rescue!

With HTTP/2 we are allowed multiplexing, which essentially means we are allowed to have more than one truck on each road:

With HTTP/2 a road (network connection) can handle many trucks (requests/responses).

We can immediately see the change in behaviour on a waterfall diagram – compare this with the one above (note the change in the scale too – this is a lot faster):

We now only need one road (connection) then all our trucks (requests) can share it!

The exact speed benefits you may get depend on a lot of other factors, but by removing the problem of head of line blocking (trucks having to wait) we can immediately get a lot of benefits, for almost no cost to us.

Same old trucks

With HTTP/2 our trucks and their contents stay essentially the same as they always were; we can just imagine we have a new traffic management system.

Requests look as they did before:

The same response codes exist and mean the same things:

Because the content of the trucks doesn’t change, this is great news for implementing HTTP/2 – your web platform or CMS does not need to be changed and your developers don’t need to write any code! We’ll discuss this below.

Server Push

A much anticipated feature of HTTP/2 is ‘Server Push’, which allows a server to respond to a single request with multiple responses. Imagine a browser requests an HTML file, but the server knows the browser will also need a specific CSS file and a specific JS file. The server can then just send those straight back, without waiting for them to be requested:

Server Push: A single truck (request) is sent…

Server Push: … but multiple trucks (responses) are sent back.

The benefit is obvious – it removes another whole round trip for each resource that the server can ‘anticipate’ the client will need.

The downside is that at the moment this is often implemented badly, and it can mean the server sends trucks that the client doesn’t need (as it has cached the response from earlier) which means you can make things worse.

For now, unless you are very sure you know what you are doing you should avoid server push.

Implementing HTTP/2

Ok – this sounds great, right? Now you should be wondering how you can turn it on!

The most important thing is to understand that because the requests and responses are the same as they always were, you do not need to update the code on your site at all. You need to update your server to speak HTTP/2 – and then it will do the new ‘traffic management’ for you.

If that seems hard (or if you already have one) you can instead use a CDN to help you deploy HTTP/2 to your users. Something like CloudFlare, or Fastly (my favourite CDN – it requires more advanced knowledge to set up but is super flexible), would sit in front of your webserver, speaking HTTP/2 to your users:

A CDN can speak HTTP/2 for you whilst your server speaks HTTP/1.1.

Because the CDN will cache your static assets, like images, CSS files, Javascript files and fonts, you still get the benefits of HTTP/2 even though your server is still in a single truck world.

HTTP/2 is not another migration! 

It is important to realise that to get HTTP/2 you will need to already have HTTPS, as all the major browsers will only speak HTTP/2 when using a secure connection:

HTTP/2 requires HTTPS

However, setting up HTTP/2 does not require a migration in the same way as HTTPS did. With HTTPS your URLs were changing from http://example.com to https://example.com and you required 301 redirects, and a new Google Search Console account and a week long meditation retreat to recover from the stress.

With HTTP/2 your URLs will not change, and you will not require redirects or anything like that. For browsers and devices that can speak HTTP/2 they will do that (it is actually the guy in the steamroller who communicates that part – but that is a-whole-nother story..!), and other devices will fall back to speaking HTTP/1.1 which is just fine.

We also know that Googlebot does not speak HTTP/2 and will still use HTTP/1.1:

https://moz.com/blog/challenging-googlebot-experiment

However, don’t despair – Google will still notice that you have made things better for users, as we know they are now using usage data from Chrome users to measure site speed in a distributed way:

https://moz.com/blog/google-chrome-usage-data-measure-site-speed

This means that Google will notice the benefit you have provided to users with HTTP/2, and that information will make it back into Google’s evaluation of your site.

Detecting HTTP/2

If you are interested in whether a specific site is using HTTP/2 there are a few ways you can go about it.

My preferred approach is to turn on the ‘Protocol’ column in the Chrome developer tools. Open up the dev tools, go to the ‘Network’ tab and if you don’t see the column then right click to add it from the dropdown:

Alternatively, you can install this little Chrome Extension, which will indicate whether a site is using it (but won’t give you the per-connection breakdown you’ll get from doing the above):

https://dis.tl/showhttp2
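
You can also check from a script. Here is a small sketch using the third-party httpx library, installed with its HTTP/2 extra (pip install "httpx[http2]"); the URL is just an example.

```python
# Sketch: ask a site which protocol version it negotiates.
# Assumes `pip install "httpx[http2]"`; the URL is just an example.
import httpx

with httpx.Client(http2=True) as client:
    response = client.get("https://www.example.com/")
    print(response.http_version)  # "HTTP/2" if negotiated, otherwise "HTTP/1.1"
```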

Slide Deck

If you would prefer to consume this as a slide deck, then you can find it on Slideshare. Feel free to re-use the deck in part or its entirety, provided you provide attribution (@TomAnthonySEO):

An introduction to HTTP/2 & Service Workers for SEOs from Tom Anthony
Wrap Up

Hopefully, you found this useful. I’ve found the truck analogy makes something that can seem hard to understand a bit more accessible. I haven’t covered a lot of the intricate details of HTTP/2 or some of the other functionality, but this should help you understand things a little better.

I have, in discussions, extended the analogy in various ways, and would love to hear if you do too! Please jump into the comments below for that, or to ask a question, or just hit me up on Twitter.

Read More

If I Had to Start a Blog From Scratch, I Would…

You’ve seen me and thousands of other marketers talk about how to make a blog popular. But if you don’t set up your blog correctly, you won’t do well no matter what kind of marketing you do.

And no, I am not talking about the technical setup of your blog. I am talking about the foundation. From what you are blogging about, to how you structure your content… there are a lot of basics people get wrong.

And if you get them wrong, it’s going to be that much harder to get more traffic (and, more importantly, monetize the traffic).

So, if I had to start a blog from scratch again, here are the principles I would follow before even writing my first blog post:

Principle #1: Pick a big enough niche

Unless you are well funded, you have to pick a niche. It’s too hard to compete on a broad level with sites like Huffington Post and Business Insider. They are well funded and are able to produce huge amounts of content from contributors big and small.

And if your niche is too small, it will be hard for you to grow your traffic and monetize your blog as there just won’t be enough people interested in what you are blogging about.

When trying to find a niche, use Google Trends. Make sure to pick a niche that is bigger than “digital marketing” but smaller than “nutrition.”

Principle #2: Don’t stick with one platform

I know I’ve told you that you need to use WordPress as your blogging platform, but it shouldn’t stop there. Why not also use Medium, Tumblr, LinkedIn, and even Facebook?

These are all platforms where you can repurpose your content.

Blogging is competitive, so you’ll need to push your content out on as many platforms as possible to ensure that you get the most eyeballs.

Setting up social accounts across the different platforms is really important. Make sure the branding and imagery are the same across all of them, and try to generate some followers by following these steps so that when you start producing unique content you’ll have places to promote it.

Principle #3: Control your destiny

Google doesn’t penalize for duplicate content. But that doesn’t mean you should just post your content on every platform without thinking about it.

The only platform that doesn’t have an algorithm that you need to worry about is your own blog. Facebook, Medium, Tumblr, and LinkedIn all have algorithms you can’t fully control.

Always link back out to your site when posting on these other platforms. The more people you can get back to your site, the better chance you will have of growing your traffic and monetizing.

Other platforms like Facebook don’t make it easy for you to generate revenue if you keep your readers on their platform.

Principle #4: Blogging is both about “you” and “I”

Blogging is something that is supposed to be informal. No one wants to read an essay or a white paper.

People want to read stories. They want to be involved in a conversation, and the easiest way to do this is to use the words “you” and “I” within your blog posts.

This one simple change will help you build a deeper connection with your readers. A deeper connection means better monetization in the future.

Principle #5: Always ask questions

At the end of every blog post, always ask a question. If you don’t ask a question, people won’t know what to do next.

By asking a question, a portion of your readers will answer it by leaving a comment. This will increase engagement, which again will make monetization easier in the long run.

Principle #6: You have to stand out

There are over a billion blogs on the web, and that number is continually rising. This just means blogging is going to get even more competitive over time.

So how do you stand out in a crowded marketplace?

You have to go above and beyond. Sadly, there is no single answer as every industry is different, but typically infographics, visuals, and doing the opposite of everyone else in your space will help you stand out.

For example, if everyone in your space writes 1,000-word blog posts, test out writing 10,000-word posts. Or if everyone is using text-based content, test out visual-based content like infographics or video.

Principle #7: Your content needs to be portable

People are always on the go these days. Your content needs to be easy to digest.

And no, I am not talking about making your content mobile compatible or leveraging AMP framework (although those are good ideas). I am talking about making your content portable.

For example, creating video-based content or audio-based content (podcasts) are simple ways to make your content portable. For example, it is easier to watch video-based content on your mobile phone when on the bus or listen to podcasts while you are driving.

Principle #8: Content isn’t king unless it’s good

You’ve heard the saying that content is king. But is it really?

The Washington Post publishes over 500 pieces of content per day. The Wall Street Journal is at 240, the New York Times is at 230, and Buzzfeed is around 222.

The list keeps going on and on as there are over 2 million blog posts published daily.

In other words, writing mediocre content isn’t good enough. It won’t do well and you will just be wasting time. So, don’t write content unless it is really, really, really good.

Principle #9: You have to produce quality and quantity

It’s sad, but it is true. Not only does your content have to be amazing, but you have to publish amazing content in quantity.

Just because you are writing an amazing blog post, it doesn’t mean you will do well. Content marketing is a hit or miss game in which your posts will do well or they won’t. And in most cases, your content won’t do as well as you want no matter how good you are at marketing.

To increase your odds of success, you need to be willing to produce amazing content in quantity.

Principle #10: Your blog isn’t always the best place to blog

Especially early on, you need to save your best content for other blogs. From industry blogs to large sites like Entrepreneur and Business Insider… consider placing your best content elsewhere.

Once you’ve been blogging for a year and you have built up an audience, you’ll want to keep your best content for yourself. But in the beginning, placing your best content on more popular blogs will help you increase your brand recognition and audience.

If you aren’t sure on how to craft a guest posting proposal, read this.

Principle #11: Useful content beats viral content

We all dream about viral content, but it’s not easy to produce.

The chances of your content going viral are slim to none. And when your content goes viral it will die down… the question just becomes when.

Instead of focusing on creating viral content (when you have less than a 1% chance of producing it), focus on creating useful content. Useful content tends to be evergreen, which means it can generate steady traffic over time.

Principle #12: It’s easier to build a personal blog than a corporate one

I know I’ve mentioned that I wouldn’t build a personal brand if I started all over again, and I wouldn’t.

But that doesn’t mean that you shouldn’t leverage one. People connect with people more than they connect with corporate brands.

It’s not like you have conversations with Coca-Cola or Nike like you have with a friend.

If you want your blog to become popular faster, go with a personal brand. If you want to build something big and potentially even sell it one day, consider a corporate brand for your blog (even though it will take longer for it to become popular).

Principle #13: A blog won’t work without a community

Blogging is about creating conversations. But without readers and community, there is no conversation.

It would just be you talking…

For this reason, you can’t expect to build a popular blog without building up your social profiles.

From running Facebook and Twitter ads, to manually growing your follower counts, you need to focus on your social media game.

The bigger your social following the more people you’ll have to drive to your blog, and the easier it will be to create a community.

Principle #14: No man is an island

As you are building up a community, people will engage with you through comments.

If you don’t respond to every comment, then your community will slowly die down.

Just think of it this way… if you continually talked to someone and they ignored you each and every time, what would you do? Eventually, you would stop talking to them.

Don’t be rude to your community; help them out. Make sure you respond to each and every comment, not just on your blog but also when people comment on your social profiles.

Principle #15: People don’t read, they skim

Most of the people that come to your website won’t read. Blogs tend to have an average time on site of less than 1 minute.

At a typical reading speed of around 250 words per minute, a 2,000-word post takes roughly eight minutes to get through, so there is no way your average visitor is reading it in under a minute. That means people skim.

Make sure you write your content with the assumption that people skim. From leveraging headings to even writing a conclusion at the end of each post, this will help your readers get value out of your content even when they don’t fully read it.

Principle #16: It’s all about the headline

Some people spend 80% of their time writing the content and only 20% promoting it. Others spend 80% on marketing and 20% on the content creation. And some spend 50% of their time writing and 50% promoting.

But what about the headline? Why don’t people spend time crafting and testing amazing headlines?

What most people don’t know is that 8 out of 10 people will read your headline, but only 2 out of 10 will click through and read the rest. So focus on creating amazing headlines or else you won’t get tons of traffic.

Principle #17: Reveal your cards, all of them

Because the blogosphere is competitive, you have no choice but to reveal your cards. From your secrets to the “good stuff”… you’ll have to share it all.

If you don’t share it, you won’t be giving people a reason to read your blog over the billion other ones out there.

When revealing your cards, make sure you do it early on in each blog post. It is a great way to hook your readers and to get them to read the rest of your content.

Principle #18: Consistency will make or break you

When you continually blog, do you know what happens? Your traffic typically stays flat or slowly goes up.

But when you stop or take a break, your traffic will tank. And then when you start up again, your traffic won’t just go back to where it was, you’ll have to fight to gain your traffic back.

I once took a month break from blogging and it took me 3 months to recover my traffic. Literally 3 months.

Don’t start a blog unless you are willing to be consistent. Not just for a few months or a year, but I am talking years (3 plus).

Principle #19: Don’t ever rely on 1 traffic channel

You hear about blogs exploding with Facebook traffic or Google traffic. But do you know what happens when those sites change their algorithms?

Your traffic drops.

It’s just a question of when, so expect your traffic to drop. Don’t rely on only one traffic channel.

Before you write your first post, think about which channels you are going to leverage for traffic generation. You need an omnichannel approach, leveraging every feasible channel that works for your niche.

Principle #20: Don’t forget about Google

You should always write for humans and not search engines. But that doesn’t mean you should ignore Google.

Whatever you are considering writing about, make sure you do some basic keyword research. Head over to Ubersuggest first. Then type in a few keywords related to your article and it will show you a list of other popular phrases.

If they are relevant, make sure you blend them into your content.

This one simple thing will help ensure that your content gets the most search traffic that it can possibly generate.

Principle #21: Be willing to kill your baby

When you start a blog, people only talk about writing and marketing. But as your blog gets older your responsibilities will grow.

One of them is the willingness to kill some of your content.

Not all of your content will be relevant a year or two from now. For example, if you write about Vine, which Twitter bought and then shut down, it won’t be relevant anymore, especially if the article focuses on “Vine marketing tips.”

Eventually, you want to delete it. There is no point in keeping useless content on your blog.

Principle #22: You can’t set it and forget it

Similar to killing some of your irrelevant content, you’ll also have to update your older content.

As your content gets outdated, you’ll want to keep it fresh or people will find that it’s useless and bounce away.

This, in turn, will screw up your user metrics (bounce rate, time on site, page views per visitor) and reduce your credibility and traffic.

If you are going to blog, be willing to put resources into updating your older content as well. It’s something most bloggers don’t take into account when starting. I’ve embraced this strategy myself: I have thousands of articles on this blog, many of them older and in need of updating, and I’ll be focusing much of 2019 on keeping my content as fresh as possible.

Principle #23: People won’t come back to your blog unless you ask them to

The best visitors are repeat visitors. They are more likely to comment, link to your site, share your content on the social web, and convert into a customer.

No matter how good your content is, people won’t just come back unless you ask them to.

The easiest way to do this is through emails and push notifications.

By using tools like Hello Bar, you can easily collect emails and send out a blast every time you have a new post. And tools like Subscribers will allow you to build a push notification list.

Don’t start a blog without building an email list or push notification list. You’ll find that people who opt in are much more likely to convert into customers. So, build this from day 1.

Principle #24: Don’t wait too long to monetize

A lot of bloggers (including me) have made this mistake. We all wait till we have tons of traffic to monetize. But if you go years before trying to monetize, people will assume everything on your blog is free.

In other words, you are training your readers that they shouldn’t pay for anything. And that’s fine if you have no plan to sell anything.

But you should train them early on that not everything is free. This will make your revenue numbers better as you grow.

Principle #25: Have multiple monetization strategies

You can’t rely on only one monetization strategy, such as affiliate marketing or AdSense. Sometimes things happen that are out of your control, such as an offer getting shut down or AdSense banning you without giving a reason.

Not only is it safer to have multiple monetization methods, you’ll also make more money.

For example, some people won’t click on ads, while others may prefer buying an e-book from you.

When you start your blog, think about all of the monetization methods you want to try and plan how you are going to test them (as not all of them will work).

Principle #26: Always include a personal touch

If you can’t write with a personal touch, then don’t write. Whatever you decide to blog about, make sure you can tie in a personal story.

People prefer reading content that has stories versus content with just facts and data.

If you don’t have personal stories that you can tie in, that means you are probably blogging on the wrong subject.

Principle #27: Be willing to pay the price

Blogging isn’t easy. It’s no longer a hobby where you can just write whenever you want and do well.

If you want to succeed, you have to be willing to put in the time and energy. And if you can’t, then you have to be willing to put in money.

If you don’t then you won’t do well, no matter how brilliant of a writer or marketer you are.

Really think about whether you are willing to put hours each day into making your blog successful. And are you willing to do that for a few years? Or are you willing to hire someone from day 1 to help out?

This isn’t a principle to take lightly, and it is the biggest reason most bloggers don’t make it.

Conclusion

Everyone talks about blogging from a tactical standpoint, from how to write content to how to market it, but very few people talk about strategy.

If you don’t follow the above principles, you’ll find yourself spinning your wheels and creating a blog that doesn’t get any traction.

And if you happen to be lucky enough to gain visitors without taking the above principles into account, you’ll find that they won’t convert into customers.

So what other principles should bloggers follow? Just leave a comment below with some of the principles you follow.

The post If I Had to Start a Blog From Scratch, I Would… appeared first on Neil Patel.

Read More

Barnacle SEO in 2019: A Short and Comprehensive Guide

Barnacle SEO in 2019: A Short and Comprehensive Guide


If you run a small local business, or a new business just getting started with SEO, you might want to try barnacle SEO.

What is barnacle SEO?

Well, let’s put it this way.

Go ahead and type a keyword from your niche into the Google search bar.

Look at the top 10 results. It’s all the big names in the industry, right? Do you think you can fight them for the top 10 places? You can try, but it’s gonna be hard.

Admittedly, the whole search ecosystem is much different now compared to 10 years ago. But you’ll still see the big names, informational sites, and directory sites sitting in the organic, non-paid positions.

So instead of trying to fight them, why not try to take advantage of them instead?

That is barnacle SEO.

It’s basically a game of leverage: you use an influential, highly visible website to promote your own business, ranking and getting exposure you couldn’t otherwise achieve.

Barnacle SEO is a term coined by Will Scott of Search Influence back in the early 2000s. If you’re interested, you can check out the original article here: Barnacle SEO – Local Search Engine Optimization for The Sam’s Club Crowd.

It’s an old article, but the tactics it describes are still absolutely relevant for jump-starting your local business’s search engine presence.

Sounds good? Hooked? But not quite sure how to start?

Let’s show you some ways to start on barnacle SEO.

1. Look for high ranking directory sites

Now, obviously, before you start anything, you need to know which big names or big sites are currently sitting comfortably on the first page for your keyword, your niche, your industry.

A couple of big names come to mind right away: for the SEO industry, it’s got to be Moz, Search Engine Land, and Search Engine Journal, to name a few.

For the hotel industry, there are tons of directory-style websites you can piggyback on, like booking.com, Trivago, Agoda, and many more.

Now you have a list of big names with high enough domain authority and enough quality backlinks to keep them constantly at the top of the SERPs.

The next thing you need to do is identify how you can get onto their sites.

If it’s a directory site, you can request to have your business listed on their site.

If it’s a website like Moz, which runs a blog or an active community, you can try reaching out about a guest post.

How can you do that?

Well, that brings us to point number 2.

2. Guest posting

Guest posting is an essential step to earning yourself a place on already high-ranking, non-directory websites.

But the main idea behind this is you’re looking for organic, inbound traffic.

Organic, inbound traffic starts with content.

If you can produce kick-ass content, that’s great.

If you can produce kick-ass content and publish it as a guest post on an already high-ranking website, even better.

Because you can know for sure that your content is gonna get the readers it deserves.

You may ask: OK, so the other website is actually getting all the views from my content, how is that a good thing? Because that content is exactly what you have to bargain with for a place on their website.

Well, you see, reputable websites will make it clear, firstly, that it is a guest post; secondly, who wrote this awesome content; and finally, where readers can reach this amazing writer directly.

And that’s why you need to find a reputable, high-ranking website that sticks to these conventions, because they end up helping you.

That’s the essence of barnacle SEO. In this industry, we help each other out right? (right.)

The first step to guest posting is outreach.

Basically, you’re reaching out to all these sites, showing your interest in guest posting on their website.

And it goes from there.

Like any sincere communication, outreach will only be successful if you are actually offering them something of value, not simply asking to barge into their website to drop a link.

Keep in mind that, even when you leverage another website’s domain authority, content won’t rank if it isn’t good quality and doesn’t provide the answers readers want.

3. Content sharing sites

Now you have an awesome piece of content that is getting steady views and climbing up the SERPs as we speak.

Why not use that same content to get more views, this time on a different website? There’s a whole different set of big players in a slightly different niche.

If your content was in a blog post format, it also has the potential to be presented as a slideshow or even a video.

For a site like Slideshare, all you really need to do is turn your content into a presentation and share it.

Now, a better way is to do this the other way round.

You’ve got kick-ass content, so turn it into an easy-to-share, easy-to-understand set of slides, embed the slides in the content itself, and share it with the big-name site you’re guest posting on.

You can also make it into a video. For example, Moz’s Whiteboard Friday showcases videos where experts talk about all sorts of SEO topics.

Upload it to YouTube, Vimeo, or your Facebook page. That way you can broaden your reach to a different crowd.

You can also opt for content syndication. Articles are often cross-posted on sites like Medium and in LinkedIn Groups, so that’s definitely doable.

By simply syndicating, you also spend less effort, since you don’t have to reformat your content.

You’re also putting your content on multiple big websites and greatly increasing your exposure.

So now the readers have more places to find your awesome content.

Which means more visitors and more exposure.

I call it awesome.

One thing to keep in mind is to check the canonical options you have when publishing your content on a site like Medium.

Here’s a Medium help page on SEO and duplicate content that goes into detail on how they handle this issue.

Not quite sure what canonical means? Check out this post for a quick explanation: What Are Canonical Links And Why You Should Canonicalize Your URL.

4. Comment on Web 2.0 sites

If you have a blog, you’ve most probably had some unwelcome spam comments.

We’re kinda going down that path, as in we’re gonna go around leaving comments about our own business too.

Now, as any sane person on the planet knows, no one likes spammy comments. A comment saying “Thank you for this awesome article. www.ilovemilk.com” is not gonna cut it.

That’s downright offensive.

No one likes spam…

The main point of dropping comments on Web 2.0 sites is not to force people to visit your website, and it’s even further from dropping a link hoping to get that link juice.

It’s about building up your persona. Yes, yours.

You’re gonna share what you know about your own industry, and perhaps ask questions, engage with others.

The comment section of a piece of content is where the community is. You’re dropping a comment because you’re making yourself known as a part of the community.

And try to leave comments that are actually helpful and insightful; comments on sites like Quora or Reddit can end up ranking high for a specific keyword.

Treat your comment as if you’re writing a blog post. Give valid and actionable points. Then, when it’s backed up by the high domain authority of the website that you’re commenting on, you’ll have a bigger chance to rank.

Wait, here comes the main point.

If you come upon a situation where it’s totally ok to link your own content there, do it.

You have already established yourself as an active member of the community, not just a link dropper. People will actually take the link seriously and pay it a visit, because it’s actually gonna be relevant and reliable.

That way, you’ll have both link building and barnacle SEO by being active in one community, killing two birds with one stone.

5. Press Release

A press release is an official announcement issued by a company or a business. Simple as that.

A digital press release service is often used as a great way to expand your brand exposure and pique the interest of journalists.

A press release is also rarely just one press release; services usually sell plans that syndicate it across hundreds of websites.

Do you see where this is going?

Sending out a digital press release is borderline white hat link building.

Syndicating a press release over hundreds of media sites would build up your link profile like mad.

And a bunch of businesses are using this exact technique to build their digital presence.

You should too.

So, are you on board now?

Now, the first stop in this press release journey is, of course, looking for a reliable press release service provider.

Just search on Fiverr and you’ll find a ton of sellers offering this service. Are they the ones you need? Well, that’s the million-dollar question.

Fiverr sellers usually offer their service at a lower price. The downside? You can’t really predict what the quality will be.

Yes, you can kinda gauge it by reading the reviews left by other buyers, and you’ll only be losing a hundred dollars or so if they do a bad job.

You can also find bigger and more reputable press release firms such as MarketersMEDIA. Do be prepared for the listed price to be double or maybe triple what you’d see on Fiverr.

But hey, you get what you pay for.

When you’re paying for a press release service, you want the links to be varied; if they have a couple of big names on their distribution list, even better.

The difference between a press release and a regular piece of content is that you don’t really need to pour days into creating an informational post.

A press release is simply an announcement; things are kept short, to the point, and professional.

Something as simple as announcing that your business has added SEO to its marketing efforts can be a good enough topic for a press release.

By sending out a press release, you’re leveraging all the links you get from these websites to extend your search engine visibility.

There are even services out there offering to syndicate your piece of content to top branded sites like Reuters, Google News, Yahoo News and more.

You can definitely make good use of them if you have the budget to jump-start your SEO effort.

Let’s conclude this post with an excellent guide on how to find a suitable barnacle site, since picking the right site to latch onto is the core of barnacle SEO.

Now start your journey!

Read More