Blog

WordPress Tips for Cost Effective Hosting

Posted on Mar 28, 2019 in Greg's SEO Articles

Before you set up a WordPress website, you will need hosting. To help you choose the most affordable kind, here are a handful of tips for picking cost-effective, highly reliable WordPress hosting.

UNDERSTAND YOUR HOSTING REQUIREMENTS

Hosting providers offer an assortment of hosting types like shared hosting, VPS hosting, reseller hosting, etc.

For most small-scale WordPress sites, shared hosting (a type of hosting where numerous websites are hosted on the same server and you share bandwidth and other resources with the other sites hosted there) is the best and least expensive option. This is ideal for new sites, since bandwidth and storage requirements are quite low at launch, and you can move to a bigger plan whenever you need to.

SELECT A PROVIDER THAT OFFERS ONE-CLICK INSTALL

Many hosting providers offer one-click installations which make it much easier for people to install WordPress on the host.

UNDERSTAND THE MONEY-BACK POLICY

Most hosting services offer money-back guarantees for a limited time – this makes it easier for you to cancel hosting and get refunded if it doesn't meet your expectations.

LOOK FOR GOOD CUSTOMER SERVICE

If you’re new to WordPress and/or web hosting, you’ll probably run into issues frequently and need professional help.

COMPARE PLANS AVAILABLE; FIND OUT ABOUT RENEWAL PRICING

Hosting providers offer dirt-cheap pricing for first-time buyers, but then charge higher pricing when you renew. With a quick web search, you can find the renewal price beforehand and pick a provider depending on how much you’re willing to pay.

Is Javascript Bad For SEO?

Posted on Mar 25, 2019 in SEO Articles

Is Javascript Bad For SEO?

Does a bear poop in the woods? With javascript and SEO, the answer is just as clear, if a little more complicated.

Javascript-driven sites aren’t bad for indexation. Google can crawl a site that populates content client-side.

Javascript-driven client-side content is bad for SEO. Javascript-driven sites make Google work harder. At the very least, Google renders them more slowly. In the SERPs, that’s a competitive disadvantage.

To demonstrate (Ian rubs his hands together) I get to use quadrant diagrams.

If you already know how javascript works, what client-side rendering is, and how Google handles client-side rendering, print this, stick it to your forehead, and move on:


The javascript/SEO quadrant

For us mere mortals, here’s a fuller explanation:

Two Types Of Javascript

There are two ways client-side javascript—javascript executed by a web browser—can interact with web content:

UI enhancement changes the way the browser interacts with content rendered on the server. Examples include tabbed content, drop-down navigation, and (sigh) carousels.

Client-side rendering delivers pages and content separately. Your web browser uses javascript to merge the two.

This post talks about client-side javascript rendering and why it’s bad for SEO.

Client vs. Server-Side

Every web page is an interaction between a client (a browser, like Chrome, or a bot, like Google) and a server.

Generating a web page involves three steps:

Fetch the page template (the layout)
Fetch the content
Merge the content with the template

Server- and client-side rendering perform these three steps differently.

Server-side rendering does all three steps on the server, then sends the result to the client. The client has everything it needs, renders the full page, and goes on its merry way.

Client-side rendering uses javascript to split the labor: It sends the template to the browser or bot, then sends the content separately. The browser uses javascript to merge the content and the template. Client-side rendering has advantages: It’s speedy (if you do it right). It’s a great way to build interactive applications. But it requires more work by the client.
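To make that split concrete, here is a minimal, hypothetical sketch of client-side rendering as it might run in the browser. The /api/products endpoint, the template element, and the data shape are assumptions for illustration only, not any particular site's setup:

// Client-side rendering sketch: the template ships first, the content
// arrives separately, and the browser merges the two with javascript.
async function renderProductPage(productId) {
  // 1. The template (layout) is already in the initial HTML payload.
  const template = document.querySelector('#product-template');

  // 2. Fetch the content separately, after the page has loaded.
  const response = await fetch(`/api/products/${productId}`);
  const product = await response.json();

  // 3. Merge content and template in the browser.
  const page = template.content.cloneNode(true);
  page.querySelector('.product-name').textContent = product.name;
  page.querySelector('.product-description').textContent = product.description;
  document.querySelector('#app').appendChild(page);
}

renderProductPage('example-product');

A server-rendered page would do the equivalent of steps 2 and 3 before the HTML ever leaves the server, so the client (or Googlebot) receives the finished document.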

Here’s our quadrant diagram so far:


Server and client rendering

Static Content vs. Dynamic Interface

Some pages are just stuff: Words and pictures and links and buttons. Clicking those links and buttons sends me to another page or displays a form. They don’t profoundly modify the page itself. That’s static content, and it’s what you browse 90% of the time: Articles, product pages, blog posts, news, etc.

Other pages change a lot depending on my actions: A text editor, a multi-faceted search, or a page where content continually updates. A page like this is a dynamic interface. The Portent Title Generator, built by some incredible agency (cough) is an example:


A dynamic interface using javascript

Hopefully, your SEO strategy doesn’t hinge on dynamic content. If you’re going to succeed in SEO, you need to get your static content indexed and optimized.

Static vs. dynamic is the next part of the quadrant diagram:


Static, dynamic, client- and server-side

When you combine static/dynamic and server-side/client-side, you get a feel for where and how javascript can make SEO more difficult.

When Javascript Hurts SEO

Javascript is terrible for SEO when you use client-side rendering for static content:


The javascript/SEO quadrant

Here’s why:

Static content is what you need indexed. If you can’t get a key product into the rankings, if your blog post is invisible, you’re hosed. Fortunately, Google crawls and indexes javascript-driven static content. All good.

You also need static content optimized: You need higher rankings, and that content is how you’ll get there. The trouble starts here. Google uses two-stage rendering on javascript-powered websites: It crawls the site now, renders content later. Here’s how Google’s engineers put it:

“The rendering of JavaScript powered websites in Google Search is deferred until Googlebot has resources available to process that content.”

That’s in Google’s own words at Google I/O 2018. Check the video at 14:11.

Two learnings:

Google needs extra resources to fully crawl, render and index javascript-powered, client-side rendered pages
Google felt it necessary to point out that fact

Client-side rendering doesn’t hurt indexation. It hurts SEO. There’s a difference. As I said, Google can crawl javascript content, and it does. But two-step rendering puts client-side content at a competitive disadvantage. All these quadrant diagrams are making me giddy:


Indexing vs. SEO

If you’re doing SEO, you can’t afford to end up in the bottom-right box.

If you must use client-side rendering on static content, here are two ways to reduce the damage:

Mitigation

If you must use javascript, mitigate it using prerendering or hybrid rendering.

Prerendering and user-agent detection

Prerendering works like this:

Render a server-side version of each page on your site
Store that
When a client visits, check the user agent
If the client is a search bot, deliver the prerendered content instead of the javascript-rendered content

The logic is licking-your-own-eyeball-from-the-inside tortured: If you can deliver prerendered content, why not just do that from the start? But, if you must, try Puppeteer to do prerendering, or a service like prerender.io, which does all the work for you.
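If you go the do-it-yourself route, the workflow looks roughly like the sketch below: render the page in headless Chrome, cache the resulting HTML, and serve that snapshot when the user agent looks like a search bot. This is only a rough Node/Express sketch under those assumptions; the bot regex, the example.com origin, and the in-memory cache are placeholders, and prerender.io wraps all of this up as a service.

// Prerendering sketch with Puppeteer (Node.js). Simplified: no cache expiry,
// no error handling, naive user-agent sniffing.
const puppeteer = require('puppeteer');
const express = require('express');

const app = express();
const cache = new Map(); // prerendered HTML keyed by URL
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider/i;

async function prerender(url) {
  if (cache.has(url)) return cache.get(url);
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url, { waitUntil: 'networkidle0' }); // let client-side JS finish
  const html = await page.content();                   // fully rendered HTML
  await browser.close();
  cache.set(url, html);
  return html;
}

app.get('*', async (req, res, next) => {
  if (BOT_PATTERN.test(req.get('user-agent') || '')) {
    // Search bots get the prerendered snapshot instead of the javascript app.
    return res.send(await prerender(`https://example.com${req.path}`));
  }
  next(); // everyone else gets the normal client-rendered experience
});

app.listen(3000);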

Hybrid rendering

Hybrid rendering generates the first page/content server-side, then delivers remaining content client-side. Sort of. Most javascript libraries, such as Angular, support this. I think.

If you search for “hybrid rendering,” you’ll find seven million pages, each with a slightly different definition of “hybrid rendering.” For our purposes, assume it means “Deliver the most important content, then the other stuff.”

For example, you could use it for filtering. Coursera lets you filter courses without javascript:

But the interface gets speedier, and the results richer, if your browser supports javascript:

That’s not the best example. TRUST ME that hybrid rendering mixes javascript-driven and static content, delivering static content first.
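Since every definition differs, here's one way to picture it, as a minimal Node/Express sketch. The route, the hard-coded course data, and filters.js are hypothetical; the point is that the critical list is in the HTML the server sends, and javascript only layers the richer filtering on top.

// Hybrid rendering sketch: critical content is rendered server-side,
// enhancements load client-side afterwards.
const express = require('express');
const app = express();

// Stand-in for a real data source.
const courses = [
  { slug: 'seo-101', title: 'SEO 101' },
  { slug: 'javascript-rendering', title: 'Javascript Rendering' },
];

app.get('/courses', (req, res) => {
  const listItems = courses
    .map(c => `<li><a href="/courses/${c.slug}">${c.title}</a></li>`)
    .join('');
  res.send(`<!doctype html>
    <html><body>
      <ul id="course-list">${listItems}</ul>
      <!-- Filtering, sorting and richer results are added by client-side javascript. -->
      <script src="/filters.js"></script>
    </body></html>`);
});

app.listen(3000);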

When To Use Which

For static content, use server-side rendering or, if you must, prerendering. If you want to optimize content that’s in a dynamic interface (like Coursera’s course list), use hybrid rendering.

ONE LAST QUADRANT DIAGRAM:

Why Mitigation Sucks

My rule: If Google gives you ways to mitigate a thing, don’t do that thing at all.

You know your doctor can set a bone. That doesn’t mean you go out of your way to break your leg for giggles.

Google can handle javascript-driven sites. That doesn’t mean you go out of your way to render content using javascript.

If nothing else, remember that Google changes their mind.

But I am not a javascript hater. In some cases, javascript-driven pages make a ton of sense.

When You Should Use Javascript Rendering

Build a client-side javascript-driven website when interactivity is more important than rankings. Apps and app-like websites, aggregators, and filters require client-side javascript rendering. Then use hybrid rendering to deliver critical content to Google.

When You Shouldn’t Use Javascript Rendering

Don’t use javascript for static content. If content isn’t interactive—a basic product page, a blog post, news articles, and any other content that doesn’t have to instantly respond to user input—it doesn’t need client-side javascript.

That doesn’t include carousels and other stuff. That’s UI enhancement, not content delivery. Done right, it’s perfectly OK.

Testing

This will bunch up the undergarments of many SEOs, developers, search scientists, and engineers: Don’t test.

Tests make you feel better. They show you that Google can indeed render the content. Great! Hooray for you!

No. Boo for you! Because testing verifies indexing and rendering. It does not verify that you’re competitive.

If you’re using client-side javascript to deliver static content you’ve failed the test. Stop. Change it.

Ask Yourself Why

There are two lessons here:

Javascript can be bad for SEO
There’s a difference between SEO and indexation

If you want to compete in the rankings, don’t use client-side rendering to deliver static content, or any content for which you want to rank. Use javascript to drive app-like experiences. When you’re considering using javascript to deliver content, do a very honest assessment of the pluses and minuses.

Then remember this handy quadrant diagram. I put a lot of time into this:


The javascript/SEO quadrant


5 Most Important WordPress Blog Hosting Ingredients

Posted on Mar 25, 2019 in Greg's SEO Articles

Originally developed as just a blogging tool, WordPress has become a robust content management system (CMS) which now powers a whopping 32.3% of all websites on the internet! It’s no wonder, since it’s free, open source, easy to use, highly customizable, SEO-friendly, and fairly secure.

All you need to do to keep it running properly is consider the following five factors when choosing WordPress hosting plans.

1. SPEED
Not only does site loading speed make for stellar user experience, but site speed is also a confirmed ranking factor, meaning a faster website translates to higher rankings in Google.

Good service providers will be able to keep data transfer times low even under heavy load. You can check the load on your server’s CPU, RAM, and disk space usage in your cPanel.

2. SUPPORT
Whether you’re a newbie or a WordPress wizard, excellent customer support is crucial for your blog’s long-term success. You’re going to face issues like internal server errors, connection time-outs, the “white screen of death”, etc., so you will need a host that employs knowledgeable and prompt support staff.

To find this out, first check if the hosting provider is marketing itself specifically towards WordPress users; then check whether they provide enough help/support documentation and WordPress-specific guidance; lastly, verify how many channels of support the hosting provider offers: live chat, ticket system, telephone, forums, and emails.

3. SECURITY
WordPress, being so popular as a CMS, naturally falls victim to a large portion of brute force attacks and other hacking attempts every year. The faster you recuperate from a hack, the less severe its repercussions for your website and users. In many cases, you will need your host's help with the recovery efforts, so you have to understand how quickly your host's support team responds, how you can reach them, and how competent their support resources are. Make sure your WordPress hosting provider offers all of these security features, if not more:

Firewalls
DDoS Protection
Virus Protection
Security Protection
Spam Filter
SSL Security Certificate
Domain Name Privacy

4. SCALABILITY
Managing expectations is important in the world of blogging, but the sky’s indeed the limit – with a whole bunch of blood, sweat, and tears, your blog can grow into a massive business. So, whichever hosting provider you choose should allow you to effortlessly upgrade your plan for more web space.

5. RELIABILITY
Customer reviews and surveys can tell you a lot about the provider’s reliability.

Managing Disk Space For Your WordPress Website

Posted on Mar 21, 2019 in Greg's SEO Articles

Hosts today offer more space simply because of the significant decrease in the cost of storage space. In fact, the cost of bandwidth has been dropping as well.

Unfortunately, file sizes affect your site in ways other than just cost – the slower your website, the fewer visitors you’re going to have, so you have to keep file sizes as small as possible.

REDUCE IMAGE SIZES

You can do this by running a plugin like ReSmush.it or by manually resizing the images themselves using an image editor on your PC. You can even reduce your image size by using Optimizilla. Even a program as simple as Microsoft Windows Paint can do this effortlessly too.

REDUCE VIDEO SIZES

Videos are great as they can be very captivating visually, but they can also take up large amounts of space.

Reduce the video’s length and/or lower its resolution, and you’ll have a much faster loading website! This can be done very easily even with just Microsoft Windows Movie Maker.

REDUCE AUDIO BITRATE

Whether you have audio running on your website, or it’s part of a video on your site, you might want to reduce its bitrate; in other words, convert it into a lower-quality file (e.g. take a file from 320kbps to, say, 128kbps). This can be done using open-source programs like Handbrake.

How To Manage WordPress Disk Space For Your Website

Posted on Mar 21, 2019 in SEO Articles

How To Manage WordPress Disk Space For Your Website

Given that so many web hosting service providers are offering generous disk space today, the thought of how much space your WordPress site needs may not have crossed your mind. Hosts today offer more space simply because the cost of storage space has gone down significantly over the years. As a matter of fact, the cost of bandwidth has also been dropping.

Unfortunately, file sizes affect your site in ways other than cost – they increase the loading time of your site and that is where it really hurts. While finding the right WordPress host to grow with certainly can mitigate some performance issues, it remains in your best interest to keep file sizes reasonable.

What is Disk Space

Disk space normally refers to the total number of bytes that a disk is capable of holding. In the case of a website, it is the space you purchase from your web host.

When you create a new page, upload a new image, or install a new plugin, it gets stored in that disk space. The more files you upload to your server, the fuller it becomes.

Eventually, you will need a bigger capacity for your website.

How to Manage File size and Free Up Disk Space

Most of the space taken up by websites lies in multimedia files, which means either images or video, with the former more common. Knowing that, let’s see how we can manage these file types and clean up your WordPress site to optimize its performance.

Reducing Image sizes

Image sizes are flexible and can range from a few kilobytes to a few megabytes each. What usually affects the size of an image is its dimensions, resolution, and format. As a rule of thumb, the larger the image, the longer it will take to load.

There are many ways in which you can optimize your images to speed up performance. These range from using plugins that automatically resize images to doing it yourself manually. Today we will explore three ways in which you can optimize image files.

Manually Resizing Images

There really isn’t any better way to describe this than to say it is as easy as opening an image in an image editor and resizing it there. Even the most basic image editors like Paint in Microsoft Windows can do this painlessly.

Take for example the image below. The original size was over 3MB with dimensions of approximately 3000 by 2000 pixels. Once I opened it up in Paint and scaled the dimensions down to 750 by 480, the file size dropped dramatically to 72KB – a massive saving in space!

Caption: Paint lets you easily change the dimensions of image files

Optimize Online

There are also online tools which enable you to resize and optimize images. These vary in capabilities from simply helping you change image dimensions to even stripping unnecessary code out of the image to streamline them.

One tool I have used before and found useful is Optimizilla. This online image optimizer not only helps you change image dimensions but also lets you choose what quality level you want to keep the image at. This gives you a great amount of control over the image.

You can even change file formats to take advantage of different image types. It also isn’t necessary to manage your images individually as the tool allows for batch uploads and downloads as well as processing.

Use A Plugin

If the two methods I’ve discussed above don’t suit you then you can always revert to using a WordPress plugin. Although plugins tend to take up some site resources themselves, they can make image optimization very easy for you.

Take for example WP Smush, one of the more popular image compression WordPress plugins around. All you need to do is upload and configure it and the plugin will automatically manage all image optimization for you in the future.

Personally, I find that most plugins, while useful, are typically not as efficient as optimizing your images offline. However, it is a time-versus-benefit thing and you will need to decide for yourself which is more critical for your site.

Reducing Video sizes

If you think that image files are large and unwieldy then it is time to meet the biggest space hog on the Internet – the video file. Videos are great because they can be extremely visually captivating but they also take up large amounts of space.

However, if you really need to use videos there are several ways you can keep their space usage under control as well.

Reduce Video Length

By keeping the length of your videos shorter you are not only training yourself to make full use of shorter time periods, but also controlling size. Long videos also tend to lose viewers as they have limited attention spans. If you have longer videos that you need to share, try cutting them up into segments and posting them as a series.

Lower the Resolution

Like images, resolution of videos affects size greatly. Lowering the resolution of a video clip can make a huge difference in file size. If you feel that this may affect the quality of experience you are trying to pass on to your viewers, then allow them to choose the resolution they wish to view the video at.

Control Audio Bitrate

Another way of reducing video size is by reducing the audio bitrate. Videos are made up of a combination of audio and video and these two components can be controlled separately. You can also consider changing the audio codec to manage compression of that part. Not all videos need ultra-high-quality audio to be great.

As with managing image files, there are also several tools you can use to edit videos and optimize them. Some are easier to use while others are more complex, so choose one which suits your skill level and needs.

Some video utilities you can use are:

Handbrake
Pinnacle Studio
iMovie

Conclusion

As you can see, there are many ways you can optimize both images and video to lower the space taken up by your WordPress site. They are simple to use and, in many cases, can be completely free. The most important thing to remember is not to take images and video for granted. Even with unlimited space, using bulky images and video can result in a terrible user experience. It can also be time-consuming if you ever need to move your WordPress site.


10 Quick Wins We Can Make Using ODN as a Meta CMS

Posted on Mar 21, 2019 in SEO Articles

10 Quick Wins We Can Make Using ODN as a Meta CMS

The Distilled Optimization Delivery Network (ODN) is most famous for SEO A/B testing and more recently full-funnel testing. But fewer people are familiar with one of the other main features: the ability to act as a meta-CMS and change pretty much anything you want in the HTML of your site, without help from your development team or writing tickets. DistilledODN is platform independent, sitting between your website servers and website visitors, similar to a Content Delivery Network (CDN), as shown in the diagram below.

This use case for ODN has been popular for many of our enterprise clients who have restrictions on their ability to make on-the-fly changes to their websites for a variety of reasons. A picture (or a gif) is worth a thousand words, so here are 10 common website changes you can make using ODN that you may not be aware of.

We’ve used a variety of websites and brands that use different platforms and technologies to show anyone can make use of this software regardless of your CMS or technology stack.

Before we get started, there is some jargon you will want to understand:

Site section: A site section is the group of pages that we want to make a specific change to

Global rules: These are rules that you want to apply to all pages within a site section as opposed to only a percentage of pages (like you would with an experiment). An example might be something like “Insert self-referencing canonical”. Rules are made up of individual steps.

Steps: These are nested within global rules, and are the steps you have to take to get to the end goal. Some global rules will only have one step; others can have many more.

In the example global rule above, the steps could be something like, “Remove existing canonical”, “Replace with self-referencing canonical”

On-page values: On-page values are constant values that we extract from the pages in the site section. You can use these in steps. So for the above rule, we’d have to create two on-page values: the “existing canonical” and the “path” of the URL we want to add the self-referencing canonical to. An example site where we’ve done this is included below.

The image below shows how these different components interact with each other.

If you’d like a more detailed explanation about any of this stuff, a good place to start is this blog post: what is SEO split-testing.

Now that you’re familiar with the terminology, here are our 10 common website changes made with ODN, with GIFs:

1. Forever 21 – Trailing slash redirect

Having URLs that return a 200 status code for both the trailing slash and non-trailing slash versions can lead to index bloat and duplicate content issues. On Forever21’s homepage, you can see both “/uk/shop” and “/uk/shop/” are 200 pages.

To fix this using ODN, we create a site section that has the homepage entered as the page we want our global rule to apply to.

Then we need to create an on-page value for the page without a trailing slash. In this example, we’ve extracted this value using regex. Having this value defined means that this fix would be easy to apply to a bulk set of URLs on the website if necessary.

Next, we create our global rule. This rule only has one step: redirect the URL in our site section to the one created using the on-page value, {{path_without_trailing_slash}}.
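Outside of ODN, the same rule can be expressed in a few lines of server code. The sketch below is a hypothetical Express middleware, not ODN's rule syntax: the regex plays the role of the {{path_without_trailing_slash}} on-page value, and matching URLs get a 301 to the slash-less version.

// Trailing-slash redirect sketch (Node/Express), illustrative only.
const express = require('express');
const app = express();

app.use((req, res, next) => {
  const match = req.path.match(/^(.+?)\/+$/); // path without its trailing slash(es)
  if (match) {
    const query = req.originalUrl.slice(req.path.length); // keep any query string
    return res.redirect(301, match[1] + query);
  }
  next();
});

app.listen(3000);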

2. SmartWater Technology – Duplicate home page redirects

Often, websites will have multiple versions of their homepage that return 200 status codes, like when they have both an http:// version and an https:// version, or a www version and a non-www version. This is a problem because it means the authority of your strongest page is split across two URLs. It also means you may have a non-desirable version ranking in search results.

We can see this on SmartWater Technology’s homepage. We can fix this problem by deploying ODN on the non-www version of their site, and creating a site section for the homepage. We only have one page we want to work on in this example, so we don’t need to create any additional on-page values.

We then set up a global rule to redirect the non-www version of the homepage to the www version, which has one step. In the step we select to redirect the URL in our path list (the homepage), to the new destination we’ve entered, https://www.smartwater.com/.

3. Bentley – Adding self-referencing canonicals

As mentioned in the introduction, we can use ODN to insert self-referencing canonicals on a list of pages. We’ve done this with Bentley Motors as an example, which doesn’t have a canonical on their homepage (or any other pages).

We can fix this by setting a global rule with one step to insert this block of HTML after the <title> element:

<link rel="canonical" href="https://www.bentleymotors.com{{path}}">

We didn’t have to create an on-page value for {{path}}, since it was created by entering the homepage in our path list. This rule will add a self-referencing canonical to any page that we include in our site section.

If we wanted to, we could also use ODN to apply canonicals that aren’t self-referencing by mapping out the pages we want to add canonicals to, with their canonical page as a value created with a CSV upload.

4. Patagonia – Fixing soft 404s

Patagonia uses this landing page, which returns a 200 status code, for 404s, rather than a page that returns a genuine 404 status code. The problem with soft 404s such as Patagonia's is that they don't send the 404 signal to crawlers, even if the content on the page has the 404 message. This means search engines will see this as a real page, preventing the URL you intended to delete from being removed from the index.

To fix this using ODN, I’ve created a site section with the page path /404/. If you have multiple pages that are soft 404s, you can use other methods to define the pages in the site section. For example, you could match on any page that has “Page Not Found” in the title, or for Patagonia, we could use regex to match on any URL that contains “/404/” in it.

Once we’ve defined what pages we want in our site section, we create a global rule with one step that changes the status code from 200 to 404.

5. Amazon Jobs – Changing 302s to 301s

When a redirect is truly temporary, using a 302 status code instead of a 301 makes sense; but if you’re not planning on reverting back to the original URL, using a 302 instead of a 301 redirect means you aren’t passing link equity from one URL to the next.

Once again, this fix is simple to deploy using ODN. We have done it with Amazon Jobs in the GIF below. First, we’ve created a site section with the path of the URL we want to change the status code of. I have also changed the response code to match 302 rather than 200, which is the default for ODN.

Again, no need to create an on-page value in this instance. All that’s required is a global rule with one step, to change the status code on those URLs that match what we have in our path list from 302 to 301.

6. Etsy – Changing sitewide links that 30x/404

When you have a sitewide link with a 30x or 404 status code, it is not only a potentially frustrating experience for users, it can also have a negative impact on your SEO. If a heavily linked-to page on your site has a 301 redirect, for example, you are preventing it from being passed all the link equity available to it.

To fix this with ODN, we can replace the 301 link with the destination 200 link. We have done this on Etsy’s homepage in the GIF below.

First, we create a site section for the homepage, then a global rule with a step to replace the old blog URL. This step replaces the content of the element we’ve selected using a CSS selector with the HTML in the box.

In this case the CSS selector we have used is “a[href="https://www.etsy.com/blog/uk/?ref=ftr"]”. Using the test feature, we can see this selector grabs the element “<a class="text-gray-darker pt-xs-1 pb-xs-2 pb-md-1 display-block width-full" href="https://www.etsy.com/blog/uk/?ref=ftr"> <span>Etsy blog</span> </a>”. That’s what we are looking to replace.

We then set it to replace the above element with “<a class="text-gray-darker pt-xs-1 pb-xs-2 pb-md-1 display-block width-full" href="https://blog.etsy.com/uk/?ref=ftr"> <span>Etsy blog</span> </a>”, which has the link to the 200 version of Etsy’s blog. Now the footer link goes to the blog.etsy URL rather than the 301 /blog/uk/?ref=ftr URL.
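ODN applies this change at the edge before the HTML reaches the visitor, but the operation itself is the familiar select-and-replace you could write against the DOM. A hypothetical plain-javascript equivalent, using the URLs from this example:

// Swap a stale footer link for its final destination (illustrative only).
const oldHref = 'https://www.etsy.com/blog/uk/?ref=ftr'; // 301s to the blog subdomain
const newHref = 'https://blog.etsy.com/uk/?ref=ftr';     // resolves with a 200

const link = document.querySelector(`a[href="${oldHref}"]`);
if (link) {
  link.setAttribute('href', newHref); // the visible <span>Etsy blog</span> text stays the same
}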

7. Pixel Eyewear – Adding title tags

Changing title tags is often a desire for content creators, as metadata is one of the strongest signals you can send to Google on what your page is about and what keywords you want to target.

Say you worked at Pixel Eyewear, and after some keyword research decided you wanted to target the keyword “computer screen glasses”, rather than simply “computer glasses”. We can use ODN to make that update, and again this rule can easily be set to target a bulk set of pages.

In the path list, we include all the URLs we want this change to apply to. Then we create a global rule to add “Screen” to our page titles. This has one step, where we use the CSS selector to select the title element of the page. We then enter the HTML we want instead.

8. Pixel Eyewear – Adding content to product pages

This is an example of when a site section has multiple rules. Say that you worked at Pixel Eyewear, and you also wanted to update the descriptions on your product pages, in addition to adding “Screen” to your page titles, and you want to do this on the same pages included in the previous section.  

To do this with ODN, we create a second global rule to edit the product description. This uses a different CSS selector, “div[class="pb-3"]”. You just want the main description to be more descriptive, so you replace the first paragraph of the element, “Meet the most advanced eyewear engineered for the digital world.”, with “Our most popular product, the Capra will have you looking stylish while wearing the most advanced eyewear engineered for the digital world.”

Since there are two global rules in this section, the order you place them in will matter. ODN works from top to bottom, as shown in the diagram in the intro, so it will apply the first global rule and its steps first before moving to the second. If one of your global rules depends on something created in another, you want to be sure that global rule is listed first.

9. Liberty London – Adding meta descriptions

Meta descriptions are an important meta property for enticing users to click through to your webpage from the SERP, but it’s common for website owners to be missing them entirely, or on important pages of their site, as seen with Liberty London on their UK featured page.

We can edit the meta description content with ODN, and insert a description. First, we include the path of the target page in our path list, then create a global rule with a single step that grabs the meta description with a CSS selector. This time we set it to “Set or update the attribute of an element.” The attribute we want to replace is the content, and we want to replace it with the content entered.

This can also be used to add in meta descriptions when they’re missing entirely, or when you want to insert new ones. If you want to apply in bulk, you can upload a CSV that has the desired meta descriptions for each target URL as a value.
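Again, ODN makes this edit before the HTML is served rather than in the browser, but the selector-and-attribute step it performs maps directly onto standard DOM calls. A small hypothetical sketch, with placeholder copy:

// Set or update the content attribute of the meta description (illustrative only).
const newDescription = 'Discover the Liberty London featured edit.'; // hypothetical copy
const meta = document.querySelector('meta[name="description"]');

if (meta) {
  meta.setAttribute('content', newDescription);
} else {
  // If the tag is missing entirely, create it and add it to the <head>.
  const tag = document.createElement('meta');
  tag.setAttribute('name', 'description');
  tag.setAttribute('content', newDescription);
  document.head.appendChild(tag);
}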

10. CamelBak – Removing duplicate content

E-commerce and other websites frequently wind up with duplicate content on their websites, which can lead to drops in traffic and rankings. Faceted navigation is a common culprit. We can see this in action on Camelbak’s website, where parametered URLs like https://international.camelbak.com/en/bottles/bottle-accessories?sortValue=af41b41832b34f02975423ad5ad46b1e return 200 status codes and have no canonical tags.

We’ve fixed this in ODN by adding canonical tags to the non-parameterized URL. First, we add the relevant URL paths to our path list. Then we need to create an on-page value for the non-parameterized version of the URL. This rule uses regex to extract the content of the URL that comes before the “?” character.
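The extraction itself is a one-line regex. Here's a hypothetical javascript version of that on-page value, just to show the pattern:

// Keep everything before the "?" so parameterized URLs point at their parent page.
function urlWithoutParameters(path) {
  const match = path.match(/^([^?]+)/); // capture everything up to the first "?"
  return match ? match[1] : path;
}

// Example (shortened parameter value for readability):
urlWithoutParameters('/en/bottles/bottle-accessories?sortValue=af41b418');
// => '/en/bottles/bottle-accessories'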

Once we have this on-page value, we can use it in our global rule. Since there are no canonicals already, this global rule has one step. If there were already canonicals on these pages, self-referencing ones, for example, that still referred to the parameterized URL, then we’d have to remove that canonical before we could add in a new one.

The step to add in the canonical inserts a block of HTML after the <title> element. Then we enter the HTML that we want to be inserted. You can see that this uses the on-page value we created, giving us this string:

<link rel="canonical" href="https://international.camelbak.com{{url_without_parameters}}"/>

Because we’ve used an on-page value, we put a list of paths for relevant parameterized URLs in our path list, and it will insert a canonical to their non-parameterized parent.

This tactic can be adjusted to account for pagination with rel="prev" and rel="next" tags and many other variations. Another way to address duplicate content issues with ODN is to redirect unwanted URLs, among other approaches.

Summary

These examples are only a selection of the types of fixes ODN can employ for your website. There are many more, in addition to being able to perform SEO A/B testing and full-funnel testing. The ability to create custom values and use CSS selectors means there’s a lot of room for any of these fixes to be customized to meet the needs of your website.

If you work on a website that has a difficult time being able to make these kinds of changes (you’re not the only one), then get in touch to get a free demo of our platform in action on your website.

Product page UX

Posted on Mar 21, 2019 in SEO Articles

Product page UX

If you’ve done our All-around SEO training, you already realize that product page SEO isn’t just about optimizing your title and headings. Product page SEO is about making that product page as user-friendly as possible, making sure bounce rate is as low as possible and the page looks awesome in Google. It’s about product page UX (user experience) and technical optimization. This article is about the first part: product page UX.

Before we dive in, if you want to learn more about user experience (UX) and other essential SEO skills, you should check out our All-around SEO training! It doesn’t just tell you about SEO: it makes sure you know how to put these skills into actual practice!

In this post, I will show you a couple of great product pages. These pages have most, if not all, of the elements that make a killer product page. Besides that, I’ll show you a number of more technical improvements that are absolutely necessary if you’re serious about product page UX.

Coolblue’s product page UX

SEO isn’t all about optimizing your meta description, although that seriously helps. In most cases, leaving a meta description blank will make sure Google creates the best automatic meta description it can make. For your product pages, you’d want to convince the visitor to click your link. Coolblue, one of the largest online retailers in the Netherlands, operating a huge number of specialized webshops, adds some triggers to every meta description:

Order the Philips 273V5LHAB at Coolblue. Ordered before 23:59? Delivered for free tomorrow. Coolblue: anything for a smile.

When I buy something online, I’d like it to be delivered a.s.a.p. It makes sense to focus on that. Most of the competition in the Netherlands can’t match the USP of fast delivery like that.

Let’s look at an actual product page:

I added a couple of numbers here that I wanted to elaborate on:

1. Ordered before 11.59PM, delivered the next day for free. I want it and I want it now, or at least as fast as possible.
2. 6 stores. This is clearly a trust factor. I can go to a store in case of issues, which makes it easier to spend a higher amount of money.
3. Ratings and reviews. We did an article on testimonials; these are like that.
4. Delivered tomorrow. So small, yet so valuable. It’s available right now. Add that to item #1 in this list and you know you want to buy from this webshop.
5. Primary and secondary call-to-action. Very important. Add a main call-to-action and an alternative. (In this case Buy now, or On wish list). You’ll find this on most online shops for a reason.
6. All kinds of trust indicators like ‘can be returned for free within 30 days’, ‘customer service available until 11.59PM’ and so on.
7. Alternative views. To give the visitor a shop-like experience, you’d want the visitor to be able to ‘hold’ the product and look at it from multiple angles.
8. Product bundles. Everybody is looking for a bargain or a nice deal, right? That’s not just something we Dutch people do 😉

All these user-focused elements will make the user like the webshop, and this will make Google like the website as well. As mentioned, Product page UX isn’t just about adding the right meta description or headings. Coolblue does an awesome job on product page UX, in my honest opinion.

Amazon’s Product Page UX

Coolblue’s product page is actually a bit like Amazon’s. Here it is:

There are certainly similarities, as you can see. Let’s go over the numbers:

1. Free two-day delivery. OK, for this specific product, I’d order at Coolblue, but the fact that Amazon ships abroad makes that two-day delivery pretty awesome.
2. Alternatives. I like this. It allows me to order from the supplier I have a great experience with, instead of just going for the cheapest that is usually displayed first.
3. Ratings and reviews. Huge numbers of these, which makes me trust these ratings and reviews even more.
4. In stock. Like at Coolblue’s.
5. Call-to-action and alternatives. Amazon actually offers an option to sell your own Apple iPhone 6 for a Gift Card as a third option.
6. X answered questions. It’s comforting to know that the actual seller takes the time to help you out if needed.
7. Alternative views. See Coolblue’s.
8. Product bundles. As at Coolblue’s.

Besides these elements, Amazon has a huge advantage on other online resellers: an awesome, well-known brand. And that most certainly helps a lot too.

Last but not least, I’d like to show you a slightly different shop, named ThinkGeek.

ThinkGeek’s Product Page UX

Both the above shops, Coolblue and Amazon, show how most product pages should be set up. But there are also a lot of online retailers that don’t offer all the options mentioned before, like stock, delivery advantages and reviews. Some shops sell niche products, and don’t need these ‘extra’ triggers. ThinkGeek is one of these shops.

This last page looks much cleaner and more focused. It lacks a number of elements that Amazon and Coolblue did add to their product pages, but I am sure loads of people will prefer the clean and focused appearance of ThinkGeek’s product page. Product page UX is also about focusing on what’s most important. ThinkGeek does add a number of extra elements I’d like to mention:

1. Free shipping on orders over $75. Or whatever amount seems reasonable for your business. Just the other day, I ordered 12 plectrums I just did not need per se to match the $50 that would get me free shipping. It seems nice, but actually is also just another trigger to make you want to buy more.
2. Customer action shots. Showing actual people using your products helps visitors see themselves using your product. It’s a nice addition!
3. Social proof. Only works with a certain number of shares, obviously. It’s like reviews and ratings, but without the reviews and ratings.
4. In stock. There it is again.
5. Call-to-action and an alternative. I like the way ThinkGeek designed this. The main orange button and the smaller, secondary black button work really well together.
6. People also bought these products. That’s the bundle without a discount. Might be obvious, but showing related products could lead to extra sales per visit.
7. Alternative views. No matter how cheap the product, provide alternative views on the product.
8. Last but not least, ThinkGeek added a pretty large product description to accompany this quite simple, dull product. It shows that ThinkGeek takes their optimization very seriously and realizes that content and content SEO is just as important as creating a nice-looking page for their products (judging by this page).

What about your own product page?

I trust this article’s got you thinking about your own page. If you are an online retailer, I’d love to know what you did to optimize your page. Did you add that in stock option? Did you add urgency by listing only 3 items left? Drop me a line in the comments. I’m looking forward to it.


LinkedIn taps Bing search data for interest targeting

Posted on Mar 20, 2019 in SEO Articles

Microsoft-owned LinkedIn is expanding its interest targeting capability with Bing search data.

Why you should care

Bing has started incorporating LinkedIn data for search ad targeting. Now, we’re seeing search data be used for targeting on LinkedIn for the first time since Microsoft acquired the B2B social network in 2016.

Advertisers will be able to target LinkedIn users based on the professional topics and content they engage with on Bing as well as the professional interests they’ve indicated on LinkedIn.

LinkedIn launched interest targeting in January, allowing advertisers to target users who have indicated professional interests. It launched with more than 200 topics, such as AI, customer experience and global economy. Interest targeting can be used together with account targeting.

More on the news

LinkedIn also introduced lookalike audiences to help advertisers expand their audience targeting and prospect to those who “look like” their existing customers.
New audience templates launched Wednesday as well. Designed for newer LinkedIn advertisers, they offer a selection of more than 20 predefined B2B audiences with characteristics such as skills, job titles and groups that can get activated quickly.
See our complete coverage of the updates on our sister site Marketing Land.


Google Florida 2.0 Algorithm Update: Early Observations

Posted on Mar 18, 2019 in SEO Articles

Google Florida 2.0 Algorithm Update: Early Observations

It has been a while since Google has had a major algorithm update.

They recently announced one which began on the 12th of March.

This week, we released a broad core algorithm update, as we do several times per year. Our guidance about such updates remains as we’ve covered before. Please see these tweets for more about that: https://t.co/uPlEdSLHoX https://t.co/tmfQkhdjPL — Google SearchLiaison (@searchliaison) March 13, 2019

What changed?

It appears multiple things did.

When Google rolled out the original version of Penguin on April 24, 2012 (primarily focused on link spam) they also rolled out an update to an on-page spam classifier for misdirection.

And, over time, it was quite common for Panda & Penguin updates to be sandwiched together.

If you were Google & had the ability to look under the hood to see why things changed, you would probably want to obfuscate any major update by changing multiple things at once to make reverse engineering the change much harder.

Anyone who operates a single website (& lacks the ability to look under the hood) will have almost no clue about what changed or how to adjust with the algorithms.

In the most recent algorithm update some sites which were penalized in prior “quality” updates have recovered.

Though many of those recoveries are only partial.

Many SEO blogs will publish articles about how they cracked the code on the latest update by publishing charts like the first one without publishing that second chart showing the broader context.

The first penalty any website receives might be the first of a series of penalties.

If Google smokes your site & it does not cause a PR incident & nobody really cares that you are gone, then there is a very good chance things will go from bad to worse to worser to worsterest, technically speaking.

“In this age, in this country, public sentiment is everything. With it, nothing can fail; against it, nothing can succeed. Whoever molds public sentiment goes deeper than he who enacts statutes, or pronounces judicial decisions.” – Abraham Lincoln

Absent effort & investment to evolve FASTER than the broader web, sites which are hit with one penalty will often further accumulate other penalties. It is like compound interest working in reverse – a pile of algorithmic debt which must be dug out of before the bleeding stops.

Further, many recoveries may be nothing more than a fleeting invitation to false hope. To pour more resources into a site that is struggling in an apparent death loop.

The above site which had its first positive algorithmic response in a couple years achieved that in part by heavily de-monetizing. After the algorithm updates already demonetized the website over 90%, what harm was there in removing 90% of what remained to see how it would react? So now it will get more traffic (at least for a while) but then what exactly is the traffic worth to a site that has no revenue engine tied to it?

That is ultimately the hard part. Obtaining a stable stream of traffic while monetizing at a decent yield, without the monetizing efforts leading to the traffic disappearing.

A buddy who owns the above site was working on link cleanup & content improvement on & off for about a half year with no results. Each month was a little worse than the prior month. It was only after I told him to remove the aggressive ads a few months back that he likely had any chance of seeing any sort of traffic recovery. Now he at least has a pulse of traffic & can look into lighter touch means of monetization.

If a site is consistently penalized then the problem might not be an algorithmic false positive, but rather the business model of the site.

The more something looks like eHow, the more fickle Google’s algorithms will be in how they receive it.

Google does not like websites that sit at the end of the value chain & extract profits without having to bear far greater risk & expense earlier into the cycle.

Thin rewrites, largely speaking, don’t add value to the ecosystem. Doorway pages don’t either. And something that was propped up by a bunch of keyword-rich low-quality links is (in most cases) probably genuinely lacking in some other aspect.

Generally speaking, Google would like themselves to be the entity at the end of the value chain extracting excess profits from markets.

This is the purpose of the knowledge graph & featured snippets: to allow the results to answer the most basic queries without third party publishers getting anything. The knowledge graph serves as a floating vertical that eats an increasing share of the value chain & forces publishers to move higher up the funnel & publish more differentiated content.

As Google adds features to the search results (flight price trends, a hotel booking service on the day AirBNB announced they acquired HotelTonight, ecommerce product purchase on Google, shoppable image ads just ahead of the Pinterest IPO, etc.) it forces other players in the value chain to consolidate (Expedia owns Orbitz, Travelocity, Hotwire & a bunch of other sites) or add greater value to remain a differentiated & sought after destination (travel review site TripAdvisor was crushed by the shift to mobile & the inability to monetize mobile traffic, so they eventually had to shift away from being exclusively a reviews site to offer event & hotel booking features to remain relevant).

It is never easy changing a successful & profitable business model, but it is even harder to intentionally reduce revenues further or spend aggressively to improve quality AFTER income has fallen 50% or more.

Some people do the opposite & make up for a revenue shortfall by publishing more lower end content at an ever faster rate and/or increasing ad load. Either of which typically makes their user engagement metrics worse while making their site less differentiated & more likely to receive additional bonus penalties to drive traffic even lower.

In some ways I think the ability for a site to survive & remain through a penalty is itself a quality signal for Google.

Some sites which are overly reliant on search & have no external sources of traffic are ultimately sites which tried to behave too similarly to the monopoly that ultimately displaced them. And over time the tech monopolies are growing more powerful as the ecosystem around them burns down:

If you had to choose a date for when the internet died, it would be in the year 2014. Before then, traffic to websites came from many sources, and the web was a lively ecosystem. But beginning in 2014, more than half of all traffic began coming from just two sources: Facebook and Google. Today, over 70 percent of traffic is dominated by those two platforms.

Businesses which have sustainable profit margins & slack (in terms of management time & resources to deploy) can better cope with algorithmic changes & change with the market.

Over the past half decade or so there have been multiple changes that drastically shifted the online publishing landscape:

the shift to mobile, which offers publishers lower ad yields while making the central ad networks more ad-heavy in a way that reduces traffic to third-party sites
the rise of the knowledge graph & featured snippets which often mean publishers remain uncompensated for their work
higher ad loads which also lower organic reach (on both search & social channels)
the rise of programmatic advertising, which further gutted display ad CPMs
the rise of ad blockers
increasing algorithmic uncertainty & a higher barrier to entry

Each one of the above could take a double digit percent out of a site’s revenues, particularly if a site was reliant on display ads. Add them together and a website which was not even algorithmically penalized could still see a 60%+ decline in revenues. Mix in a penalty and that decline can chop a zero or two off the total revenues.

Businesses with lower margins can try to offset declines with increased ad spending, but that only works if you are not in a market with 2 & 20 VC fueled competition:

Startups spend almost 40 cents of every VC dollar on Google, Facebook, and Amazon. We don’t necessarily know which channels they will choose or the particularities of how they will spend money on user acquisition, but we do know more or less what’s going to happen. Advertising spend in tech has become an arms race: fresh tactics go stale in months, and customer acquisition costs keep rising. In a world where only one company thinks this way, or where one business is executing at a level above everyone else – like Facebook in its time – this tactic is extremely effective. However, when everyone is acting this way, the industry collectively becomes an accelerating treadmill. Ad impressions and click-throughs get bid up to outrageous prices by startups flush with venture money, and prospective users demand more and more subsidized products to gain their initial attention. The dynamics we’ve entered is, in many ways, creating a dangerous, high stakes Ponzi scheme.

And sometimes the platform claws back a second or third bite of the apple. Amazon.com charges merchants for fulfillment, warehousing, transaction based fees, etc. And they’ve pushed hard into launching hundreds of private label brands which pollute the interface & force brands to buy ads even on their own branded keyword terms.

They’ve recently jumped the shark by adding a bonus feature where even when a brand paid Amazon to send traffic to their listing, Amazon would insert a spam popover offering a cheaper private label branded product:

Amazon.com tested a pop-up feature on its app that in some instances pitched its private-label goods on rivals’ product pages, an experiment that shows the e-commerce giant’s aggressiveness in hawking lower-priced products including its own house brands. The recent experiment, conducted in Amazon’s mobile app, went a step further than the display ads that commonly appear within search results and product pages. This test pushed pop-up windows that took over much of a product page, forcing customers to either click through to the lower-cost Amazon products or dismiss them before continuing to shop. … When a customer using Amazon’s mobile app searched for “AAA batteries,” for example, the first link was a sponsored listing from Energizer Holdings Inc. After clicking on the listing, a pop-up window appeared, offering less expensive AmazonBasics AAA batteries.”

Buying those Amazon ads was quite literally subsidizing a direct competitor pushing you into irrelevance.

And while Amazon is destroying brand equity, AWS is doing investor relations matchmaking for startups. Anything to keep the current bubble going ahead of the Uber IPO that will likely mark the top in the stock market.

Some thoughts on Silicon Valley’s endgame. We have long said the biggest risk to the bull market is an Uber IPO. That is now upon us. — Jawad Mian (@jsmian) March 16, 2019

As the market caps of big tech companies climb they need to be more predatious to grow into the valuations & retain employees with stock options at an ever-increasing strike price.

They’ve created bubbles in their own backyards where each raise requires another. Teachers either drive hours to work or live in houses subsidized by loans from the tech monopolies that get a piece of the upside (provided they can keep their own bubbles inflated).

“It is an uncommon arrangement — employer as landlord — that is starting to catch on elsewhere as school employees say they cannot afford to live comfortably in regions awash in tech dollars. … Holly Gonzalez, 34, a kindergarten teacher in East San Jose, and her husband, Daniel, a school district I.T. specialist, were able to buy a three-bedroom apartment for $610,000 this summer with help from their parents and from Landed. When they sell the home, they will owe Landed 25 percent of any gain in its value. The company is financed partly by the Chan Zuckerberg Initiative, Mark Zuckerberg’s charitable arm.”

The above sort of dynamics have some claiming peak California:

The cycle further benefits from the Alchian-Allen effect: agglomerating industries have higher productivity, which raises the cost of living and prices out other industries, raising concentration over time. … Since startups raise the variance within whatever industry they’re started in, the natural constituency for them is someone who doesn’t have capital deployed in the industry. If you’re an asset owner, you want low volatility. … Historically, startups have created a constant supply of volatility for tech companies; the next generation is always cannibalizing the previous one. So chip companies in the 1970s created the PC companies of the 80s, but PC companies sourced cheaper and cheaper chips, commoditizing the product until Intel managed to fight back. Meanwhile, the OS turned PCs into a commodity, then search engines and social media turned the OS into a commodity, and presumably this process will continue indefinitely. … As long as higher rents raise the cost of starting a pre-revenue company, fewer people will join them, so more people will join established companies, where they’ll earn market salaries and continue to push up rents. And one of the things they’ll do there is optimize ad loads, which places another tax on startups. More dangerously, this is an incremental tax on growth rather than a fixed tax on headcount, so it puts pressure on out-year valuations, not just upfront cash flow.

If you live hundreds of miles away, the tech companies may have no impact on your rental or purchase price, but you still can’t control the algorithms or the ecosystem.

All you can really control is your mindset & ensuring you have optionality baked into your business model.

If you are debt-levered you have little to no optionality. Savings give you optionality. Savings allow you to run at a loss for a period of time while also investing in improving your site and perhaps having a few other sites in other markets.
If you operate a single website that is heavily reliant on a third party for distribution, then you have little to no optionality. If you have multiple projects, you can shift your attention toward whatever is going up and to the right while letting anything that is failing sit for a while, without becoming overly reliant on something you can’t change. This is why it often makes sense for a brand merchant to operate their own ecommerce website even if 90% of their sales come from Amazon. It gives you optionality should the tech monopoly become abusive or otherwise harm you (even if the harm stems from indifference rather than outright misanthropy).

As the update ensues, Google will collect more data on how users interact with the result set & determine how to weight different signals, along with re-scoring sites that recovered based on the new engagement data.

Recently a Bing engineer named Frédéric Dubut described how they score the relevancy signals used in updates:

As early as 2005, we used neural networks to power our search engine and you can still find rare pictures of Satya Nadella, VP of Search and Advertising at the time, showcasing our web ranking advances. … The “training” process of a machine learning model is generally iterative (and all automated). At each step, the model is tweaking the weight of each feature in the direction where it expects to decrease the error the most. After each step, the algorithm remeasures the rating of all the SERPs (based on the known URL/query pair ratings) to evaluate how it’s doing. Rinse and repeat.
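To make that iterative "tweak the weights, re-measure the error, repeat" loop more concrete, here is a minimal hypothetical sketch in Python. The feature names, ratings, and learning rate are invented for illustration only; production ranking systems use far more features, far more judged query/URL pairs, and far more sophisticated models than this simple linear one.

```python
import numpy as np

# Each row is a judged query/URL pair with hypothetical feature values:
# [anchor_text_match, content_quality, user_engagement]
features = np.array([
    [0.9, 0.2, 0.1],
    [0.3, 0.8, 0.7],
    [0.5, 0.5, 0.4],
    [0.1, 0.9, 0.8],
])
# Human relevance ratings for those pairs (0 = irrelevant, 1 = highly relevant)
ratings = np.array([0.2, 0.9, 0.5, 1.0])

weights = np.zeros(3)   # start with every feature weighted at zero
learning_rate = 0.1

for step in range(2000):
    predicted = features @ weights             # score every query/URL pair
    error = predicted - ratings                # compare against the known ratings
    gradient = features.T @ error / len(ratings)
    weights -= learning_rate * gradient        # nudge each weight to reduce the error

# After training, the quality/engagement features carry more weight than anchor text
print(weights)
```

The point of the sketch is only the shape of the process: score, measure against known ratings, adjust the feature weights, and repeat until the error stops shrinking.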

That same process is ongoing at Google now, & in the coming weeks we’ll see the next phase of the current update.

So far it looks like some quality-based re-scoring was done & some sites which were overly reliant on anchor text got clipped. On the back end of the update there’ll be another quality-based re-scoring, but the sites that were hit for excessive manipulation of anchor text via link building efforts will likely remain penalized for a good chunk of time.


Backlinks vs social shares: How to make your content rank for different SEO metrics

Posted by on Mar 18, 2019 in SEO Articles | Comments Off on Backlinks vs social shares: How to make your content rank for different SEO metrics

Backlinks vs social shares: How to make your content rank for different SEO metrics

A new study by Kaizen has revealed that content that performs well for backlinks does not necessarily perform well for social shares and vice versa.

Analyzing over 2,300 pieces of finance content, Kaizen identified the best-performing pieces of content for URL rating, number of referring domains, and number of social shares. Nine of the top 10 pieces of content with the highest URL ratings also featured in the top 10 for most referring domains.

This shows a clear correlation between the two: the more referring domains a piece of content earns, the higher its URL rating tends to be.
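As a rough illustration of what that relationship means in practice, here is a small hypothetical sketch of how you might check it for your own content. The referring-domain and URL-rating figures below are invented, not taken from the Kaizen study; a rank correlation is used because link counts are heavily skewed.

```python
from scipy.stats import spearmanr

# Hypothetical figures for a handful of content pieces
referring_domains = [6372, 2100, 950, 430, 120, 45]
url_ratings       = [84,   71,   63,  55,  38,  22]

# Spearman rank correlation: close to 1 means "more referring domains,
# higher URL rating" holds across the sample
rho, p_value = spearmanr(referring_domains, url_ratings)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```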

The best-performing piece of content for both URL rating and the number of referring domains was the Corruption Perceptions Index 2017, by Transparency International. The campaign highlighted the countries that are or are not making progress in ending corruption, finding that the majority of countries were making little or no progress.

But what made this campaign succeed so well in SEO terms?
1. It has global appeal

By placing emphasis on visual components of content, the campaign is easily understandable without language and is based on data from across the world, making it globally link-worthy.

2. It is emotional content

The piece evokes an emotional response from the element of corruption and the fact that the majority of countries in the world are making little or no progress in ending corruption.

3. It is evergreen content

“Evergreen content” is content that is not tied to a specific date or time of year and can be outreached (and can gain links) at any time. In addition, Transparency International is able to update the data each year, creating a new story for outreach and increasing its chances of landing links.

By combining these typical elements of viral content, the Corruption Perceptions Index earned 6,372 referring domains and a URL rating of 84, making it the most successful piece of finance content in the study. Use these three aspects as a checklist for your own content, and it will stand a much better chance of achieving similar results.

Social shares

The Corruption Perceptions Index also ranked in the top 10 pieces of content for social shares, with a total of nearly 48,000. However, it is one of only two pieces to rank in the top 10 for both the backlink metrics (URL rating or referring domains) and social shares. There is much less correlation between social-share success and backlink success, showing that the two are not directly or significantly linked.

The most successful piece of content for social shares was this car insurance calculator by Confused.com, with 91,000 total social shares. This piece of content, as well as the majority of the top 10, is B2C-focused. In comparison, the URL rating and referring domains lists are more technical and B2B-focused.

Therefore, B2B content performs better for SEO strategies focused on backlinks, whereas B2C tools and guides suitable for customers rather than businesses perform better for social shares.

The Corruption Perceptions Index is an exception, performing well for both backlinks and social shares. By focusing on analytical data from experts and business people, and by providing relevant data for both businesses and consumers, it offers equal value to B2B and B2C audiences.

In conclusion

Don’t expect the same piece of content to perform well for both backlinks and social shares. But, if you are able to create content that provides equal value for both B2B and B2C communities, you will have the opportunity for multiple outreach strategies, with resounding value throughout the industry.

Nathan Abbott is Content Manager at Kaizen.
