
How to Use Videos to Improve Your SEO and Ranks


Video marketing has gone through the roof in recent years. Marketers and brands are posting more videos online, and consumers like it.

- People are 10x more likely to interact with video than with blocks of text
- On a landing page, videos increase conversion rates by up to 80%
- Up to 90% of consumers who watch a video share it

Above all, videos now directly impact your SEO performance and SERP ranking. Including a video in a web article or blog post, for instance, makes you 53x more likely to rank on the first page!

This guide specifically focuses on the role of video in search engine optimization. We want to show you how to leverage video to boost your SEO performance.

Why is Video an Important Part of SEO?

There are three main reasons:

Videos enhance the depth and quality of your content 

Although it’s a much more complex process, the basics of search engine ranking involve crawling through all available content pieces and deciding what results to return based on how well each piece answers the searcher’s query.

Videos pack a ton of information. Thus, search engines assume that pages with videos will answer a query more comprehensively. The result? Higher rankings.

Videos boost user experience, resulting in longer session lengths

Bounce rates are a real challenge for the majority of digital marketers. Bloggers, for example, experienced a 70-90% bounce rate in 2019, meaning that up to 9 out of 10 times, visitors didn’t read a blog post to the end.

Videos are an excellent way to reduce your bounce rate. Simply adding a video to a blog post cuts bounce rates by up to 34%.

Videos increase backlinks to your site

Although you can also earn backlinks with plain-text content, videos promise a lot more links. People are much more likely to link back to a video than a text block.

But, it’s not just about someone from another site linking to your site. Videos are also shared more, further increasing the potential for backlinks. If person A shares your video, person B, who comes across the shared video, can link back to it without even visiting your site.

Keyword Research for Video Results

Once you understand the importance of video in SEO, the next step is to conduct keyword research for your video marketing campaign. Keyword research is the process of finding words and phrases that people use for video search.

Why You Need Video Keyword Research

You’re probably wondering why you need separate keyword research for video marketing when you already do keyword research for text-based SEO. There are three reasons:

- Videos have their own search engines: YouTube, the largest video platform, uses a search algorithm that’s different from Google’s.
- Key ranking factors vary: Video search engines use a completely different set of ranking factors. The elements necessary to rank in Google won’t necessarily help you rank favorably for YouTube searches.
- Search intent is different: How people search for general content isn’t how they search for video. Even the search terms they use are usually different.

4 Steps to Successful Keyword Research for Video

There’s no one right way to do video keyword research. The most important thing is to get the basics right. Here’s what we recommend:

Brainstorm ideas

Begin by thinking about the topics you’re interested in and the type of videos you want to create. For example, if you run an Airbnb, you can have two types of videos. The first type could be marketing videos, with descriptions and video tours. Then, the second type could involve feedback from your clients.

Use autocomplete for inspiration

Once you have a few topics in mind, head over to YouTube, type in some of the words, and see which auto-suggestions come up. These suggestions are never random. They are like FAQs – the queries the search engine receives most often. Make a list of the terms for further research.
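If you want to collect those suggestions programmatically, one common (if unofficial) approach is Google’s public suggest endpoint with the YouTube data source. A minimal Python sketch, assuming the undocumented suggestqueries.google.com endpoint, which may change or disappear without notice:

```python
import json
import urllib.parse
import urllib.request

SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

def parse_suggestions(raw):
    """The suggest endpoint returns a JSON array shaped like
    [query, [suggestion, suggestion, ...], ...] - keep the second element."""
    payload = json.loads(raw)
    return list(payload[1])

def fetch_youtube_suggestions(seed):
    """Fetch YouTube autocomplete suggestions for a seed term (unofficial API)."""
    params = urllib.parse.urlencode({"client": "firefox", "ds": "yt", "q": seed})
    with urllib.request.urlopen(SUGGEST_URL + "?" + params) as resp:
        return parse_suggestions(resp.read().decode("utf-8"))
```

Calling fetch_youtube_suggestions("airbnb tour") would return the completions YouTube shows in its search box, ready to add to your research list.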

Perform competitor research 

Begin by typing your primary keyword(s) on YouTube and noting the brands on the first page of results. Then, check each brand’s video titles and descriptions. You can also use tools such as TubeBuddy to find out the tags used in each video.

Consider YouTube SEO keyword research tools

There are several YouTube SEO tools, including Kparser, Hypersuggest, KeywordKeg, YtCockpit, and WordTracker. For most of these tools, all you have to do is enter a word or phrase. The dashboard will then fill up with dozens of related keywords. Most YouTube SEO tools also give a keyword volume and competition index.

Creating a High Retention Video

Now that we’ve identified the right keywords, the next step is to create high-retention, interactive videos. Follow the steps below:

- Write a home-run script: You need an excellent script that grabs attention from the off and convinces the viewer to watch to the end.
- Choose a great character: Your main character must be both likable and relatable. The little details, such as age, health, and clothing, are also vital.
- Find a captivating voiceover: The voice can make or break the video. So, don’t cut corners. You need the best possible voice.
- Keep it flowing: The viewer must not take their eyes off the video. You want them glued to the screen throughout.
- Music and sound effects: Here, too, the minor details, including background score and sound effects, matter greatly.
- Round off with a CTA: A video marketing piece without a CTA is useless. Finish by telling the viewer to take a specific action.
- Prepare a video landing page: A video landing page is a page, possibly on your site, where you direct all the traffic from a particular video.

SEO Best Practices for YouTube Optimization

Once you’ve created the video, heed the following tips and best practices for maximum YouTube SEO performance.

Tailor your videos to the people looking for them

Two things are especially important. First, make sure that you know your target audience. This can be done by creating buyer personas. Secondly, use a language and tone that suits the technical level of your audience.

Add your SEO keywords to your video title and descriptions 

Place your primary keyword in the video title and description. If the opportunity arises, also incorporate secondary and long-tail keywords in these areas. However, don’t use this as an excuse to make your titles unnecessarily long and stale.

Include a video transcript to boost SEO further

Adding a transcription of your video on the same page as the video gives search engines more information and context. The engines can’t read visual content. So, transcribed material gives them the text to crawl and rank.
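As a rough illustration of how transcribed material can be exposed to search engines, here is a sketch that builds a schema.org VideoObject JSON-LD snippet (VideoObject does have a transcript property); the video details below are hypothetical:

```python
import json

def video_jsonld(name, description, transcript, thumbnail_url, upload_date):
    """Build a schema.org VideoObject JSON-LD snippet with the transcript inlined."""
    data = {
        "@context": "https://schema.org",
        "@type": "VideoObject",
        "name": name,
        "description": description,
        "thumbnailUrl": thumbnail_url,
        "uploadDate": upload_date,
        "transcript": transcript,  # gives crawlers text for an otherwise visual asset
    }
    return ('<script type="application/ld+json">\n'
            + json.dumps(data, indent=2)
            + "\n</script>")

# Hypothetical video details, for illustration only.
snippet = video_jsonld(
    "Lakeside Cabin Tour",
    "A walkthrough of our lakeside Airbnb cabin.",
    "Welcome to the cabin. On your left you can see...",
    "https://example.com/cabin-thumb.jpg",
    "2021-06-01",
)
print(snippet)
```

Pasting a snippet like this into the page’s head, alongside the visible transcript, gives the engines structured text to crawl.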

Aggressively promote your videos 

Posting the video on your site or YouTube is just the first step in the marketing process. You need to get the word out by aggressively promoting the video on all your online platforms, including your website, blog, social media accounts, and even industry forums.

Conclusion

You need video SEO. Not only does it boost your potential for traffic and enhance session length, but it also increases your chances of appearing in SERPs – where you stand to get even more traffic and leads.


The 6-Step Guide to Creating a Content Cluster that Ranks


SEO content marketing has evolved considerably over the past few years. Understanding that evolution gives you an edge over other SEO professionals out there.

With competition for high Google rankings increasing exponentially, many people are producing great content but still not ranking as high as they would expect. So, if you are one of those SEO professionals whose content is ranking low, you will find this guide extremely helpful.

What is a Content Cluster?

The easiest way to explain what a content cluster really is involves comparing two different sites: one with content clusters and one without.

With no content clusters, your SEO content creation will not be as effective as you would want it to be. On the other hand, if your site has a content cluster, it will have an impeccably organized approach to SEO content creation. 

Content clusters follow the pillar-and-topic-cluster model, an SEO strategy tailored towards focusing on key topics rather than individual keywords.

The content cluster targets topics while using internal linking to enhance user experience. As a result, it boosts SEO. This is contrary to what you would expect with the SEO content strategy that uses keywords.

A content cluster involves two major types of content:

Content Pillars: This term refers to content designed to give a complete answer to the questions users may have on a certain topic. In most cases, it is long-form content that functions as the main go-to content for the keyword cluster. As a matter of fact, it targets short-tail keywords with high search volumes.

Topic Cluster or Cluster Pages: This refers to a set of subtopic pages. Most of these pages are blog articles written to target related, longer-tail keywords. In fact, the topic cluster is more specific and detailed compared to content pillars.

Together, the two components allow you to create outstanding and high-quality content that is useful to your clients. Also, the same content makes it easy for Google to crawl. 

Basically, this information helps you to understand and explain what a content cluster really is.
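To make the two components concrete, here is a minimal sketch of how a pillar and its cluster pages might be modeled; the URLs and keywords are hypothetical:

```python
# Hypothetical pillar and cluster pages, for illustration only.
content_cluster = {
    "pillar": {
        # Long-form page targeting a short-tail, high-volume keyword.
        "url": "/video-marketing-guide",
        "keyword": "video marketing",
    },
    "cluster_pages": [
        # Subtopic articles targeting related, longer-tail keywords.
        {"url": "/video-seo-tips", "keyword": "video seo tips for beginners"},
        {"url": "/video-landing-pages", "keyword": "how to build a video landing page"},
    ],
}

def cluster_keywords(cluster):
    """Every keyword the cluster targets: the pillar's plus each cluster page's."""
    return ([cluster["pillar"]["keyword"]]
            + [page["keyword"] for page in cluster["cluster_pages"]])

print(cluster_keywords(content_cluster))
```

The point of the structure is simply that every long-tail page hangs off one broad pillar, which is what makes the internal linking discussed later mechanical.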

How Did Content Clusters Come About?

There is no doubt that Google’s algorithm is constantly changing. This frequent transformation of algorithms keeps SEO professionals on their toes throughout. So if you are an SEO expert, you need to be aware of such changes from time to time. 

In the past few years, there has been a progressive shift in the way Google algorithms work. This change has affected SEO professionals in many different ways. For instance, if you are not giving your users priority, Google will not put you first.

Some years back, search engines would give you feedback based on keywords. But this is not the case at the moment. 

Initially, the keyword-focused algorithm had some weaknesses, which led to bad SEO behaviors known as “black hat” search engine optimization (SEO). Keyword stuffing is a perfect example of black hat SEO that you need to know.

Keyword stuffing does not make your content more relevant to your audience. It means you are writing for an algorithm rather than for people, which is exactly what black hat SEO is. And this is where Google RankBrain comes in to save the situation.

The Emergence of Google RankBrain

Google has taken a lot of time to research its algorithm to improve SEO content. To achieve this feat, Google went ahead to introduce state-of-the-art machine learning technology known as Google RankBrain. 

This piece of technology expands the scope of how search engines such as Google evaluate and index pages for better ranking. However, keywords still matter despite the introduction of Google RankBrain in creating content clusters.

Before RankBrain, Google used a basic algorithm to determine the type of results users would get for their queries. The post-RankBrain algorithm works differently when it comes to showing the results for a given query.

In this case, the query passes through the interpretation model first. Then the model applies some factors such as location (of the searcher), personalization and the information contained in the query. 

The model carries out this task to know the true intent of the searcher. Through discerning the true intent of the searcher, Google is able to give back more relevant results. 

In other words, if your dream is to have a site that can perform optimally in search engines, you should be in a better position to help Google provide relevant, reliable and high-quality content.

Content clusters play a critical role in connecting topically-related content on your website. Through internal linking, the content clusters are analyzed by Google RankBrain to evaluate the quality of your content during a crawl. This helps users and searchers get relevant results for their queries.

Why is it Important to Structure Your Content into Topic Clusters?

Topic clusters have been a popular trend in SEO and blogging for quite some time. This is because they give your site a better platform to rank high in the search engine results.

Reaching such heights is a long process full of obstacles. But with determination and persistence, topic clustering will help you improve your search rankings, not to mention generate organic traffic.

In general, topic clusters work hand in hand with overarching subjects commonly referred to as pillar content. To give you a clear picture of what this means, pillar content focuses on a broad keyword while topic clusters focus mainly on specific keywords and phrases with smaller search volumes. 

This means that the better your cluster content ranks, the more credibility it builds with Google for the pillar content.

Below are some of the benefits that come with pillar content:

- More time spent on your site
- Reduced bounce rate
- More keyword mentions
- Increased social media shares and backlinks
- Better Google ranking with longevity

After creating your pillar content, don’t let it sit idle and age. Instead, find some time to update it with fresh data, examples, tips and expert quotes. These updates will keep it relevant to your audience and, in return, help it rank high on Google.

How Do You Create Effective Content Clusters?

Now that you have an idea of content clusters, it is time to learn how you can successfully create effective content clusters. Here is a step-by-step guide for building a good content cluster strategy.

a) Choose the Topics for Your Content Pillars

When deciding on the topics, you must consider the function of your content clusters. As such, you should try to ask yourself if your content will reach the target audience, convince and close important sales or convert leads. In this step, your main goal is to consider the buyer personas and their search intents.

b) Analyze Your Goals and Conduct Keyword Research

Before you start touching on any content, you must do a thorough analysis of what you expect after creating a content cluster. 

During the analysis, you may take into account the following points:

- What you are hoping to achieve with your target SEO content strategy
- The type of keywords you want to focus on more than others
- The most important conversion goals that will make your content effective

Once you have these goals at the back of your mind, you will be able to start building out your clusters.

c) Perform Content Audit

Performing an audit of your content is critical, especially if you are working with sites that have an extensive archive of cluster content. In this regard, you will have to categorize the existing blog content topic by topic. Keep in mind that these topics will become your clusters while any content about specific topics will become cluster content for those topics.

d) Establish Pillar Pages

After you have identified your cluster topics, the next task is to establish a pillar page. This page will help you connect all the content found within the cluster. Essentially, the pillar is the hub or cornerstone of a cluster. Its main function is to connect other content within the cluster, and its broader scope allows it to interlink every element of the content.

e) Implement Internal Linking

Now that you have your pillar pages and clusters ready, the next task is to implement the internal linking. This process will involve the use of hyperlinks. Therefore, every content should have a link to your pillar page. The pillar page, on the other hand, must be naturally interspersed with several links to your cluster content.
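The hub-and-spoke linking rule above can be checked mechanically. Here is a rough Python sketch, using only the standard library, that flags cluster pages missing a link to the pillar page; the page URLs and HTML below are hypothetical:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href found in an HTML document."""
    def __init__(self):
        super().__init__()
        self.hrefs = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def links_to(html, target_url):
    """True if the page's HTML contains at least one link to target_url."""
    collector = LinkCollector()
    collector.feed(html)
    return target_url in collector.hrefs

# Hypothetical pages: every cluster page should link back to the pillar.
pillar_url = "/video-marketing-guide"
cluster_pages = {
    "/video-seo-tips": '<p>See our <a href="/video-marketing-guide">full guide</a>.</p>',
    "/video-landing-pages": "<p>No pillar link here.</p>",
}
missing = [url for url, html in cluster_pages.items() if not links_to(html, pillar_url)]
print("Cluster pages missing a pillar link:", missing)
```

In practice you would fetch each cluster page’s HTML and run the same check, and also verify the pillar page links out to every cluster page.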

f) Promote Your Content

At this point, your content is ready, and the next task is to promote it. This step will help your content reach as many users as possible. With time, it will rank high on Google, and that should be your ultimate goal.

Measure and Maintain the Success of Your Cluster Content

There are two ways you can measure and maintain the success of your cluster content. The first is identifying and filling content gaps; the second involves optimizing and updating your pillar content. These two methods will help you generate fresh ideas for your future content.

Conclusion

There you have it! This comprehensive guide to creating a content cluster that ranks enables you to shift your focus from a keyword-by-keyword structure to a topic-based one. In the end, you will find it easier to build better content, enhance your visibility on Google and provide the best user experience.


Largest Contentful Paint & Diagnosing Googlebot’s Render Budget


It was just over a year ago that Dan Leibson open sourced aggregate Lighthouse performance testing in Google Data Studio (yet another resource in the SEO industry inspired by Hamlet Batista). I’d like to share with you some observations I’ve made over the past year of running Lighthouse testing on our clients’ sites and how I think Google’s render service operates. 

 

I’ve noticed some interesting similarities between how successfully Google renders a page and the measurement thresholds in Core Web Vitals that define good scores. In this post I’ll share a few methods to investigate Google’s rendering and how I feel that relates to LCP. 

 

There are plenty of other resources by excellent SEOs if you need a general overview of Core Web Vitals. Today I’ll be talking almost entirely about LCP.

Google’s Core Web Vitals – As Reflections Of Googlebot’s Tolerances

 

Here are two quotes from Google before I really dive into the realm of “SEO Theory”. They come from a Google Webmasters support thread, where Cheney Tsai compiles a few FAQs regarding Core Web Vitals minimum thresholds for acceptable performance; this part is specifically about PWAs/SPAs.

 

Q: If my site is a Progressive Web App, does it meet the recommended thresholds?

 

A: Not necessarily since it would still depend on how the Progressive Web App is implemented and how real users are experiencing the page. Core Web Vitals are complementary to shipping a good PWA; it’s important that every site, whether a PWA or not, focuses on loading experience, interactivity, and layout stability. We recommend that all PWAs follow Core Web Vitals guidelines.

 

Q: Can a site meet the recommended thresholds if it is a Single Page Application? 

 

A: Core Web Vitals measure the end-user experience of a particular web page and don’t take into account the technologies and architectures involved in delivering that experience. Layout shifts, input delays, and contentful paints are as relevant to a Single Page Application as they are to other architectures. Different architectures may result in different friction points to address and meet the thresholds. No matter what architecture you pick, what matters is the observed user experience.

 

https://support.google.com/webmasters/thread/86521401?hl=en 

(bolding is mine)

 

I think the PWA/SPA conversation is especially interesting to the concepts I’d like to discuss here as it has relevance to “static HTML response vs rendered DOM”, and how js resources impact things at the highest level of complexity; but the concepts remain true at the lowest level of complexity too.

 

When Cheney says “different architectures may result in different friction points”, this is a polite way of saying that, especially with PWAs/SPAs, there are likely going to be performance problems for both users and Google caused by complex js-driven experiences. These delay LCP and FID, or potentially hide content from Googlebot entirely. But this kind of problem doesn’t exclusively apply to PWAs/SPAs…

 

If the content isn’t being painted quickly enough, Google can’t see it, and neither will the user if they back out to search results after impatience with a spinning loading wheel. Google seems to have aligned their level of “render patience” with that of a typical user’s – or less. 

 

Googlebot has a render budget, and page speed performance optimizations for user experience (Core Web Vitals) are critical to how well Googlebot’s render budget is spent. 

 

This is a good place to define how I consider render budget, which is in two ways:

 

The frequency with which Googlebot sends your URLs to its render service
How much of your pages’ assets Googlebot actually renders (more on this later)

 

Core Web Vitals help us diagnose which page templates are failing in certain technical measurements, either through field data available in Google Search Console from CrUX data (Chrome users), or through manually created aggregate Lighthouse reporting.

Good, bad, good.
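If you build that kind of aggregate reporting yourself, the heart of it is just pulling metrics out of Lighthouse’s JSON output (for example from `lighthouse <url> --output=json`). A small sketch, assuming the report structure Lighthouse currently uses for its largest-contentful-paint audit; real reports carry many more fields than the trimmed sample here:

```python
import json

def lcp_from_report(report_json):
    """Pull LCP (in seconds) out of a Lighthouse JSON report.
    Lighthouse stores audit timings in milliseconds under numericValue."""
    report = json.loads(report_json)
    ms = report["audits"]["largest-contentful-paint"]["numericValue"]
    return ms / 1000.0

# A trimmed-down report, like one produced with `lighthouse <url> --output=json`.
sample = json.dumps({
    "audits": {"largest-contentful-paint": {"numericValue": 3100.0}}
})
print(lcp_from_report(sample))  # 3.1
```

Run this over a batch of reports for the same template and you have the aggregate LCP picture the dashboards visualize.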

 

One approach I’ve been taking over the past year is to then try and connect the dots between those technical measurements, and resource loading failures from Googlebot measurable with other tools and techniques. 

 

What I’ve found is that these two things are often connected, and can help one diagnose the other. This method of diagnosis has also helped me uncover other render problems Googlebot is encountering that simple LCP testing in Chrome devtools/Lighthouse will not reveal.

 

Methods of Diagnosing Render Issues Related to LCP

 

There are three main types of render problems Googlebot has with content that I find are related to Largest Contentful Paint, and I inspect them in three different ways.

 

Live inspection render preview shows missing assets – URL inspection in GSC
Google cache render snapshot shows missing assets – Google cache: inspection
Last crawl result shows Googlebot errors with assets – URL inspection in GSC (and server log analysis)

 

Of these, number three has been the most interesting to me and one I haven’t seen discussed elsewhere in the SEO blogosphere, so I’d be especially curious for your thoughts on that.

 

(For clarity here, a page asset is any file requested with the page: css, js, images, etc.)

 

1. Live inspection render preview shows missing assets

Here’s a “Tested page” screenshot in Google Search Console, using the inspection tool which sends out Googlebot Smartphone. The page is rendered and a mobile preview is returned with missing images. 

Wow, Gadsden sure looks nice to visit this time of year.

 

If URLs in a certain page template respond this way 100% of the time with live URL inspection, you can rest assured that Googlebot Smartphone is never rendering your images from this template correctly, and this is the type of broken page they are seeing. This isn’t a render budget problem, this is a “Google can’t render your content even when they try” problem (assuming you’ve ensured Googlebot is being delivered the js).

 

In the example above, all of the site’s images were delivered through js chunks that Googlebot was unable to render. I’ve encountered several sites like this, where LCP is flagged as high because users’ mobile devices take a long time to load the js before rendering images*, and Googlebot never sees the images due to its inability to render more complex frameworks. 

 

*A small note here, that LCP isn’t strictly about “how long it takes to load an image”, rather it is a measurement of how long it takes until the largest resource on the page loads, which could be anything; but is often an image.

 

2. Google cache rendered snapshot shows missing assets

Here’s another type of scenario I’ve dealt with multiple times. Live URL inspection with the technique above sends out Googlebot Smartphone, and a clean rendered preview is returned. Images load, nothing looks broken. But when inspecting the last rendered cache snapshot from Google, I discover mixed results. Sometimes Google caches the page with no images, and sometimes it caches the page with images loading OK. 

 

Why is this? How can it be that pages in the same template, running the same code, load differently for Google sometimes? It’s really quite simple, actually: Googlebot sometimes computes the rendered DOM, and sometimes it doesn’t.

 

Why

 

Why doesn’t Google fully render every page on the internet all the time?

 

Well, in short,

“Money!” – Mr. Krabs 

Google can veil Core Web Vitals under the guise of “Think of the users!”… But they’ve been fighting an endless battle to crawl and render the javascript web, which is always evolving and growing in complexity. And sure, no one wants a slow loading site. But Google is also thinking of their pocketbook.

 

Perhaps @searchliaison would disagree with this take, but from the outside looking in, it seems like if not the primary driver for the CWV update, then it at least is a convenient by-product of it.

 

Crawling the web is expensive. Rendering it is even more so, simply because of the time and energy it takes to download and compute the data. There are more bytes to process, and js adds an extra layer of complexity. 

It reminds me of when my mom would be dismayed to find I used all of the color ink cartridge to print out 50 pages worth of video game guides, and every page has full footer, banner, and sidebar images of the gaming website’s logos.  

Image via https://web.archive.org/web/20010805014708/http://www.gamesradar.com/news/game_news_1248.html 

Imagine these sidebars printed in color ink on every one of 50 pages. The year was 2000, and I was playing Star Wars: Starfighter…

 

But if I copy-pasted those 50 pages into Microsoft Word first, deleted all of the color images, and printed in black and white, FAR LESS INK would be used, and mom would be less upset. The printer got the job done way faster too!

 

Google is just like mom (or a printer? I guess mom is the Google engineer in this analogy) and “painting” (rendering) a web page and all its images/resources (js/css) is the same thing as printing in color ink. The ink cartridge represents Google’s wallet.  

 

Google wants YOU to do the work, much like I had to do the work of manually removing the images before printing. Google wants you to make their life easier so that they can save money, and by becoming the leading force of Page Speed Performance, and literally defining the acronyms and measurements in Core Web Vitals, Google sets the standard. If you don’t meet that bar, then they will literally not render your site. 

 

That’s what this post is all about. If you don’t meet their LCP score (or other scores), a measurement bar they have set, then they will timeout their render service and not consider all of your content for Search eligibility.

 

Whereas view-source HTML, the static HTML, is like the black and white ink. It’s way smaller in size, quick to receive, quick to analyze, and thus CHEAPER for Google. Just because Google can sometimes crawl your rendered DOM, doesn’t mean they always will.

LCP is an acronym related to other acronyms, like CRP, DOM and TTI.

 

Google would much prefer it if you invested in creating a pre-rendered static HTML version of your site just for their bots, so that they don’t have to deal with the complexity of your js. The onus of investment is on the site owner. 

 

I’m obligated to say that Google cache isn’t a definitive analysis tool, but my point here is that if Google can cache your pages perfectly 100% of the time, you are likely delivering a simple HTML experience. 

 

When you see Google encounter inconsistent errors in caching, it likely means they are having to rely on sending your content to their render service in order to view the content correctly, and further analysis in GSC/elsewhere should be made to figure out wtf is going on, if Google can/can’t properly see your content, especially when these things are happening at scale. You don’t want to leave this stuff to chance. 

 

3. Last crawl result shows Googlebot errors with assets

 

This is where shit gets really interesting. When I encounter the scenario presented above (sometimes Google caches resources for a certain page template correctly, sometimes they do not, yet Googlebot Smartphone ALWAYS renders the content correctly in live URL inspections), I have found a pattern of crawl error type left behind in Google’s last crawl result. 

Image taken from https://ohgm.co.uk/x-google-crawl-date/ 

 

This is a tab of Google Search Console I learned about from, in my opinion, the smartest technical SEO mind in the industry – Oliver H.G. Mason of ohgm.co.uk. It’s the “More Info” tab of URL inspections in GSC, where you can click “HTTP Response” and see a provisional header left by Google, called “X-Google-Crawl-Date”. As you may have deduced, this is the date and time Googlebot last crawled the page.

 

It was after reading this blog post and discovering this header that I began to pay more attention to the “More Info” tab when inspecting URLs. There are two other options in this tab: “Page Resources”, and “JavaScript console messages”. 

 

What I have found in the “Page Resources” tab, over and over again, is that Googlebot in the wild has a much lower tolerance level for asset heavy page templates than Googlebot Smartphone sent out in GSC live URL inspections. 

56 of 160 page resources weren’t loaded by Googlebot the last time they crawled this theater page – many of which were movie poster art .jpgs. But when I perform a live test with this same URL in GSC, there are only 5 to 10 page resource errors on average, mostly scripts.

 

These errors are vaguely reported as “Other error” with an XHR designation (other common possibilities are Script and Image). So WTF is an “Other error”? And why does the quantity of these errors differ so vastly between Google’s last crawl result in the wild, vs a live URL inspection in GSC?

 

My theory is simple: Googlebot has a very conservative render timeout when crawling sites, in order to save time and resources – which saves them money. This render timeout seems to align with the scores flagged as yellow and red in LCP. If the page takes too long to load for a user, well, that’s about the same amount of time (or less) that Googlebot is willing to wait before giving up on page assets.

 

And that seems to be exactly what Googlebot does. As you can see from that screenshot above, Google chose to not render about ⅓ of the page’s resources, including those important for SEO: images of movie posters for the ticket showtimes! I’ve found that quite frequently, the images marked as errors here do not appear correctly in Google’s last rendered cache: snapshot of the same URLs.

These tiles are supposed to be thumbnail images for videos. Instead they are a sort of modern art block set of colored squares.

The entire <body> is almost entirely scripts, Google rendered some of the page content but not all. At least we got some colored square tiles.
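For reference, Google’s published LCP thresholds (good up to 2.5s, needs improvement up to 4s, poor beyond that) are easy to encode if you want to bucket your own field or lab data the same way CWV does:

```python
def lcp_rating(lcp_seconds):
    """Classify an LCP measurement against Google's published CWV thresholds."""
    if lcp_seconds <= 2.5:
        return "good"               # green
    if lcp_seconds <= 4.0:
        return "needs improvement"  # yellow
    return "poor"                   # red

for lcp in (1.8, 3.2, 5.0):
    print(lcp, "->", lcp_rating(lcp))
```

Whether Googlebot’s render timeout tracks these exact numbers is my speculation above, but bucketing your templates this way at least tells you which ones are most at risk.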

 

This is not something that you should leave to chance, like all things Googlebot it’s up to you to find these issues and manually diagnose them, then find ways to manipulate Google’s behavior for an outcome that makes their job of rendering your content easier. 

 

Otherwise, you are gambling with your site’s render and crawl budgets and hoping the automated systems figure out something close to optimal. I’d rather not. How to accomplish that is a post for another day.

 

There are problems with this GSC page resource error method

 

GSC reporting is noisy, it can't easily be done at scale, it can be unreliable or unavailable at times, and it isn't true for 100% of sites that the generic XHR "Other errors" in these last-crawl reports align with the LCP issues I'm trying to diagnose. But the method can still be useful for diagnosis and testing.

 

A Google representative might say, "These errors are an inaccurate representation of what's happening in our algorithm; it's much more complex than that," and that's fair enough. Their point may be that when the "real" render agent (i.e., one with no render-budget restrictions) is sent out, as happens in a live URL inspection, there are indeed no page errors – and that Googlebot in the wild will sometimes open up its render budget and do the same thing.

 

But I care about what Google is doing at scale when assessing huge quantities of pages. When Google isn't rendering every time, or gives up on the render because the page takes too long to load, that can become a huge SEO problem.

 

It's the same kind of problem as a canonical attribute that is only visible in the rendered DOM and not in the static HTML. It doesn't matter much that Google can see the canonical properly when it relies on the rendered DOM, if it doesn't do that 100% of the time for 100% of your pages. You're going to end up with canonicalization inconsistencies.
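To catch this class of problem, you can diff the canonical tag in the raw HTML response against the one in the rendered DOM. Below is a minimal, stdlib-only sketch; the sample markup and example.com URL are hypothetical, and in practice the rendered DOM would come from a headless browser (e.g., Puppeteer or Playwright), which is omitted here:

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Collects href values of <link rel="canonical"> tags."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            if a.get("rel", "").lower() == "canonical" and a.get("href"):
                self.canonicals.append(a["href"])

def extract_canonical(html):
    """Return the first declared canonical URL, or None if absent."""
    p = CanonicalParser()
    p.feed(html)
    return p.canonicals[0] if p.canonicals else None

# Static HTML as served vs. the DOM after JS injects the canonical:
static_html = "<html><head><title>x</title></head><body></body></html>"
rendered_dom = '<html><head><link rel="canonical" href="https://example.com/a"></head></html>'

print(extract_canonical(static_html))   # None – Google may never see it
print(extract_canonical(rendered_dom))  # https://example.com/a
```

If the two values differ (or the static one is missing, as above), that page is relying on a render Google may not always perform.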

 

But how are you going to check this at scale when Google limits us to inspecting only 50 URLs per day? That limit is the number one thing I wish Google would remove or raise – aside from better information on where URLs are being canonicalized elsewhere when Google ignores canonicals, as one small example… We could rant for a while on that…

Is there any hope?

 

A little. If you have access to server logs, I recommend comparing error differences between Googlebot's various user agents – specifically, the number of times each of your page assets responds with anything other than 200 OK, broken down by user agent type. This will sometimes get you something similar to the last-crawl page resource error reporting available in GSC.
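As a rough sketch of that log analysis – the log lines below are made-up samples in a simplified combined log format, and a real pipeline should also verify Googlebot hits via reverse DNS, which is omitted here:

```python
import re
from collections import Counter

# Simplified combined-log-format pattern (assumed log layout):
# ... "METHOD /path HTTP/1.1" STATUS BYTES "referer" "user-agent"
LOG_RE = re.compile(
    r'"\w+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def non_200_by_agent(lines):
    """Count non-200 responses per (Googlebot user agent, asset path)."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if not m:
            continue
        if "Googlebot" in m.group("ua") and m.group("status") != "200":
            counts[(m.group("ua"), m.group("path"))] += 1
    return counts

sample = [
    '1.2.3.4 - - [01/May/2021:00:00:01] "GET /poster.jpg HTTP/1.1" 503 0 "-" "Googlebot-Image"',
    '1.2.3.4 - - [01/May/2021:00:00:02] "GET /page HTTP/1.1" 200 512 "-" "Googlebot"',
    '1.2.3.4 - - [01/May/2021:00:00:03] "GET /app.js HTTP/1.1" 500 0 "-" "Googlebot"',
]
print(non_200_by_agent(sample))
```

Assets that error heavily for Googlebot-Image or the smartphone agent, but not for your own monitoring, are good candidates for the "Other error" entries GSC shows.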

 

Another quick task I do is to sort all verified Googlebot crawl events by number of occurrences, then filter by URLs that are canonicalized to versus from. You can generally tell fairly easily when Google is ignoring the canonicals on mass amounts of URLs.
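A toy version of that sort, assuming you've already extracted a list of crawl events and a mapping from each URL to its declared canonical target (all URLs below are made up):

```python
from collections import Counter

def crawl_hotspots(crawl_events, canonical_map):
    """Rank crawled URLs by hit count and flag those whose declared
    canonical points elsewhere. Heavy crawling of such URLs suggests
    Google is ignoring the canonical."""
    hits = Counter(crawl_events)
    report = []
    for url, n in hits.most_common():
        target = canonical_map.get(url)
        if target and target != url:
            report.append((url, n, target))
    return report

# Googlebot keeps hammering a parameterized URL that canonicals to /a:
events = ["/a?ref=1", "/a?ref=1", "/a?ref=1", "/b"]
canonicals = {"/a?ref=1": "/a", "/b": "/b"}
print(crawl_hotspots(events, canonicals))  # [('/a?ref=1', 3, '/a')]
```

If non-canonical URLs dominate the top of that ranking, Google is spending crawl budget where your canonicals say it shouldn't.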

 

Why do any of this? 

 

While it's true that Lighthouse reporting and Chrome DevTools may help you identify some of the assets causing LCP issues for users, these other techniques help you connect the dots to how well Googlebot is technically accessing your content. Lighthouse reporting is not perfect and has failed me where other methods succeeded. Sometimes only Googlebot experiences server response issues while your Node/Chrome Lighthouse testing does not. Sometimes websites are simply too complex for Lighthouse to analyze correctly.

 

Sometimes the water is muddier than it appears in automated reporting tools, with mixed behavior evident across the various Googlebots.

 

What about FID? CLS?

 

This post was mostly concerned with LCP, as I mainly wanted to discuss how Google's render service times out on resources, and how that seems to relate to LCP scoring. LCP is also the problem I most often find sites struggling with, and it's usually more obvious to fix than First Input Delay (FID).

 

LCP also seems like the most sensible place to start, as many of the same JavaScript issues that elongate LCP also contribute to poor FID. There are other areas of FID to think about, like the critical rendering path, code-coverage waste, paring down assets by page template, and much more – but that's an entire post in and of itself.

 

CLS is so obviously bad for everything, and so easy to detect, that it isn't really worth discussing here in detail. If you have any CLS above 0.0, resolving it should be a high priority. Here's a good resource.

 

Conclusions

 

I believe Google spends its render budget as conservatively as possible, especially on large, sprawling sites, opting the majority of the time to rely on static HTML when possible. I'm sure certain signals trigger Google to adjust render budget, like its own comparisons of static HTML vs. rendered DOM, and your domain's authority and user demand in search results.

 

Perhaps all of the other pieces of the SEO pie, like link authority and content quality, earn you a higher render budget through the automated systems as well, because your site is perceived as "high quality" enough to render frequently.

 

“Is your site big and popular enough that we should spend the money rendering it, because our users would feel our search results were lower quality if we didn’t include you?” I’d be willing to bet that Google engineers manually turn the render budget dials up for some sites, depending on their popularity.

 

If this is not you, then you might consider optimizing for a theoretical render budget – or at the very least, optimizing for tangible Core Web Vitals scores.

 

To start, I recommend checking your LCP scores and diagnosing where Google (and Chrome) might be choking on some of your more complex resources. Here are a few places to begin the analysis:

 

Create Lighthouse reporting in aggregate for all of your site’s most important templates
Investigate GSC render with live URL test for URLs that have LCP issues
Investigate Google cache snapshots for all important URLs with LCP issues
Investigate GSC last crawl result “page resources” error reporting
Compare static HTML vs rendered DOM, assess for possible areas of simplification that affect important page content
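The first item on that list can be scripted. One way – assuming the `lighthouse` npm CLI is installed, and treating the template names and URLs below as placeholders – is to generate one CLI invocation per page template, which you could then run via `subprocess`:

```python
import shlex

def lighthouse_commands(template_urls, out_dir="lh-reports"):
    """Build one Lighthouse CLI invocation per page template, writing a
    JSON report per template for later aggregation."""
    cmds = []
    for name, url in template_urls.items():
        out = f"{out_dir}/{name}.json"
        cmds.append(
            f"lighthouse {shlex.quote(url)} --only-categories=performance "
            f"--output=json --output-path={shlex.quote(out)} "
            f"--chrome-flags=--headless"
        )
    return cmds

# Hypothetical templates for a movie-theater site:
templates = {
    "theater": "https://example.com/theaters/some-city",
    "showtime": "https://example.com/showtimes/12345",
}
for cmd in lighthouse_commands(templates):
    print(cmd)
```

Run the JSON reports through a small aggregator (or a spreadsheet) to compare LCP per template rather than per one-off URL.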

 

The post Largest Contentful Paint & Diagnosing Googlebot’s Render Budget appeared first on Local SEO Guide.


Microsoft Advertising Partner Summit announces video ads, in-browser price comparisons and Facebook import

Microsoft’s Advertising Partner Summit begins today and includes the announcement of multiple new product updates and insights. Here are some of the key announcements:

Video ads in the Microsoft Audience Network. With over $5 billion spent on video ads in 2020, Microsoft is kicking off a pilot for video ads in the US and the UK. Previously you could only bid CPC in the Microsoft Audience Network, but the video ads pilot will allow both CPC and CPM bidding. Microsoft Advertising has also had variations of video extensions in and out of its Ads Lab program for a while now, but the video extension ads program is launching in several countries for anyone in the pilot. The clickable video image shows with a play button, and the video then opens and plays within the search experience.

Facebook import for Microsoft Audience Network. If your audience is active on Facebook and other native channels, this new import feature allows you to pull campaign structure from Facebook into Microsoft Advertising. You'll be able to import audience targets where they align with Microsoft's sets and bring over creative such as images and body copy. This follows the popularity of Google import, which helps make campaign setup easy in Microsoft Advertising.

Coupons and price comparisons in the Edge browser. A new pilot of browser-native price comparisons and coupons is launching in the U.S. on desktop. This will give advertisers the opportunity to surface products to buyers looking for specific items online. "If someone is looking for an InstantPot on retailer A and you're retailer B and you have a better price, you potentially could surface here," said John Lee, Head of Evangelism at Microsoft Advertising.

Options for small businesses. To help businesses that don't yet have a website but want to start advertising to Microsoft Bing searchers, the company is launching a US pilot called Smart Pages for small businesses. It's an automated WYSIWYG platform that helps businesses easily create a site they can then use to advertise anywhere. Unified Smart for small business also provides a single entry point for a small business (or anyone) to input basic info and launch omnichannel advertising (Facebook, Instagram, Twitter, and Microsoft Advertising) in a short amount of time.

Private search for Bing API. Since privacy is top of mind for many users, and therefore advertisers, Bing is also announcing the launch of Private Search for Bing, which allows partners like DuckDuckGo to serve ads and search results to their users without the searcher's data being transferred. The solution is based on Azure and ensures that Microsoft won't receive any PII in the search exchange.

Other reminders and announcements include the following:

Automotive ads
Property promotion ads
Shutterstock partnership
Multimedia ads
Tours and activities ads
Static headlines for DSAs
Marketing with Purpose online course

Why we care. While Microsoft may have a smaller market share and audience reach than Google, it's still working to innovate and keep up in the advertising space. These new product announcements, and continued research into product improvements, give advertisers and businesses more options to take advantage of the search market Microsoft Bing does have. Plus, with privacy top of mind for both searchers and advertisers after the news of Google's FLoC and Apple's IDFA, it makes sense that Microsoft is ensuring it's part of that privacy conversation.

The post Microsoft Advertising Partner Summit announces video ads, in-browser price comparisons and Facebook import appeared first on Search Engine Land.


How to Optimize Your Website for Google’s Mobile-First Index

Google’s mobile-first index has made it all the more important to have a mobile-responsive, mobile-friendly website. There are things you can do now to make sure you are ranking well on the mobile-first index. We’re going to assume that you either have a dedicated mobile website or, better still, a responsive site. If not, you […]

The post How to Optimize Your Website for Google’s Mobile-First Index appeared first on The Daily Egg.


We now have WordPress LiteSpeed technology in our hosting platform…


About a week ago we reached out to all of our hosting customers to inform them that we would be making a massive hosting environment change: the migration from an Apache server to a LiteSpeed server. We are happy to report that the migration is complete, and it was a great decision – one that benefits our customers and creates a cutting-edge hosting environment to further your online success.

What is LiteSpeed Web Server?

LiteSpeed Web Server is a drop-in Apache replacement and the leading high-performance, high-scalability server from LiteSpeed Technologies. You can replace your existing Apache server with LiteSpeed without changing your configuration or operating system details. As a drop-in replacement, LiteSpeed allows you to quickly eliminate Apache bottlenecks in 15 minutes with zero downtime. LiteSpeed’s expansive range of features and easy-to-use web administration console are an asset to any effective web hosting infrastructure.

Take a look at the massive performance difference that LiteSpeed server technology offers compared to the technologies more commonly used by web hosting companies. LiteSpeed handles more requests per second per user. In plain English, your WordPress website can manage a much larger number of requests per second without sacrificing any speed.

WordPress’s Best Friend For Speed


As we said earlier, we were using Apache technology on our server and now we are using LiteSpeed. Let's take a look at the advantages and disadvantages of both below; you'll see the stark contrast between the two technologies.

APACHE

From the Apache Software Foundation, this is the most widely used web server today. It was created by Brian Behlendorf and Rob McCool in 1995, based on already-existing code alongside some "hacky and clunky" software patches that enabled its operation.

For this reason, it earned the name Apache server ("a patchy server"). Apache's dominance is helped by the fact that it comes preinstalled on all the major Linux distributions, giving it a strong foothold: software that's already installed is easier to run.

Advantages of APACHE

It is flexible.
Wide selection of modules.
Apache is updated and maintained regularly.
Its documentation is extensive and quite useful.
It performs well and is reliable.
It can add more modules and features.
Apache can create virtual hosts on one server.

Disadvantages of APACHE

Consumes more RAM under heavier load.
Spawns new processes for each request, making it less efficient.
Custom configurations and protocols can introduce new bugs that require debugging.
Apache needs a rigid update policy, applied on a regular basis without fail.
It is a process-based server.
Unwanted modules and services must be identified and disabled, since leaving them on can cause serious security risks.
No formal support; support is community-based.

LITESPEED

LiteSpeed is one of the most recent web servers. It is a relative newcomer to the web server industry, but it is gaining popularity very fast, having earned a huge following in the last few years due to its efficiency. Its streamlined architecture allows companies to serve their highest-traffic websites from their servers.

LSPHP is one of LiteSpeed's major components for PHP applications such as WordPress and Joomla, which the majority of hosting customers use to run their websites. LSPHP is a bridging process that connects PHP applications to the web server.

LiteSpeed is very efficient and reliable compared to Apache, even though Apache is already relatively reliable.

Advantages of LiteSpeed Web Server

Supports most Apache modules.
LiteSpeed has an event-driven architecture, which gives it high performance.
Its enterprise version is compatible with Apache configuration files, making it possible to drop in as a much faster web server for shared hosting.
LiteSpeed can be administered from a graphical user interface.
Its paid version has tweaked modules and advanced caching options for eCommerce purposes.
24/7 support from the LiteSpeed team.

Disadvantages of LiteSpeed Web Server

LiteSpeed web server has an expensive licensing model compared to Apache.

LiteSpeed Cache vs. WP Rocket

WP Rocket is one of the industry leaders in WordPress speed optimization. By no means would we say that WP Rocket is not a good plugin to install and use on your website to maximize speed. What we are saying is that you can get better results using LiteSpeed Cache. Take a look at the in-depth article at the link below for a side-by-side comparison of LiteSpeed Cache and WP Rocket on a WordPress website.

READ ARTICLE AT THE LINK BELOW:
https://blog.litespeedtech.com/2019/05/20/litespeed-cache-vs-wp-rocket

How About Some Other Caching Plugins

Let’s take a look at a few other popular WordPress caching plugins and how LiteSpeed compares. You can view these results in detail at https://www.litespeedtech.com/benchmarks/wordpress.

Real Results We Achieved Below using LiteSpeed Cache

 

BEFORE – CLICK HERE to view speed report

AFTER – CLICK HERE to view speed report

So now that you’ve read all this information, what are you supposed to do with it?

We have put together a specialized service for all of our hosting customers that optimizes their websites to take full advantage of LiteSpeed web server technology. This incredible LiteSpeed technology is only accessible to customers on our hosting platform. Check out the link below to read all about our hosting platform.

If you have any questions, feel free to reply to this email and I will get them answered for you immediately. Take care, and we look forward to making your website blazing fast with the newest technologies available.

SEE FULL DETAILS AT LINK BELOW:
https://www.wpfixit.com/wordpress-hosting-with-support

The post We now have WordPress LiteSpeed technology in our hosting platform… appeared first on WP Fix It.
