SEO Articles

What Google's Star Rating Rich Snippet Changes Mean for You


On Monday this week, Google announced that they would be changing their treatment of star rating rich snippets in search results. 

We already know that Google wasn’t happy with how star rating rich snippets were being used in the wild. “Structured data penalties” have become common in the past couple of years, mostly (in my experience) for the practice of placing organisation-level ratings markup across your entire site, which Google considered inaccurate or misleading (as the ratings weren’t for the specific content of the page they were on). Lots of examples of this exact behaviour continued to exist, however, with Google seemingly unable to enforce their own rules at scale (as is common for activities requiring manual review).

So what has changed?

It looks like Google is about to get far stricter about when they will and will not respect ratings structured data – the Schema.org markup which has until now triggered these enhanced search results.

In particular, the use of a rating widget (e.g. Trustpilot, Feefo, hardcoded) to show reviews of your organization or business on a site that is run by your organization or business will no longer be respected. You can see Google’s updated guidelines here.

This covers probably the majority of real world use cases I’ve seen, although that may just be my experience.

What is Google aiming for, and what are they enforcing?

It seems that Google wants to push these ratings to apply mainly to critical or genuine 3rd party reviews – for example, a newspaper’s review of a film. Weirdly the distinction that Google is making in what they’re aiming for is whether the reviews are “self-serving” – I’d argue that having a website at all is fairly self-serving in most cases, unless you’re a charity or are running the reviews section of your site as an act of altruism.

Unfortunately, my pedantry isn’t going to get anyone their ratings back, but it does seem that for now there’s a distinction between what Google is aiming for here and what they’re actually saying they’ll enforce.

In particular, reviews of your own products on your product or category pages are handled somewhat ambiguously here – although marking up products is allowed, this could be only in the case of expert, 3rd party reviews (e.g. by a car magazine, of a car), rather than “self-serving” reviews (e.g. on the car manufacturer’s website, of one of their models). 

This tweet from John Mueller at Google suggests reviews of products you sell are fine:

It’s specific to reviews about your business, we tweaked the text in the blog post to make it a bit clearer: “Self-serving reviews aren’t allowed for LocalBusiness and Organization” (reviews for, say, products you sell are fine)

— John (@JohnMu) September 16, 2019

But in most cases these could be your own products, in which case this seems equally self-serving – though Google will probably be unable to tell the difference. At the very least, Google seems to be implying by omission that they won’t enforce this.

I’ll update this article with some real world examples once it becomes clear whether this gray area remains an opportunity for website owners – right now, results are largely as before.

EDIT (26th September): Yes, you can just re-mark otherwise compliant ratings as product mark-up, with a named product, and get your ratings back.
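
For illustration, here’s a minimal sketch of what that product-level markup might look like in JSON-LD. The product name and rating figures below are made up, and you should validate your own markup against Google’s current guidelines rather than copying this verbatim:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget Pro",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "187"
  }
}
</script>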

Will I lose my star ratings?
Vendors

If the Schema.org “type” that star ratings on your site are attached to falls into the following list, then it’s very likely you will:

LocalBusiness (if you are that business)
Organization
Service

If the type that star ratings on your site are attached to falls into the official Google list, but you are the vendor of that thing rather than a critical reviewer, then it’s somewhat likely you’ll lose your ratings, but not clear right now.

Critical Reviewers

Google has narrowed supported types, and now requires the “name” field, so possibly, yes.

You can check whether you’re using a supported type using the Google Structured Data Testing Tool, which has been updated to reflect the new guidelines, but which does not currently identify whether a review is “self-serving”.

Should I care?
Vendors

If all your competitors have markup that also falls foul of this, and you do too, then the commercial impact should be net zero.

On the other hand, if you were a business large enough to have to play by the rules, and your competitors were getting away with sitewide organization review markup, this is a net win for you – and you don’t even have to do anything!

And, lastly, if you spent a ton of money on a 3rd party review platform to enable these features but your competitors didn’t, you’ll be understandably annoyed.

Critical Reviewers

Yes, you should care – this is a good way to make your results stand out in search, now more than ever.

What can I do to get my star ratings back?
Vendors

If you were previously showing reviews of yourself or your own product or service, your best bet is to hope that Google doesn’t enforce against reviews marked up as product reviews, as I mentioned above. However, there doesn’t appear to be any risk of penalty – the guidelines only suggest that in the worst case you wouldn’t be eligible for star ratings.

Critical reviewers

If, on the other hand, you are a genuine third party reviewer but were reviewing a type that is no longer supported (e.g. a service), then there is no ambiguity about whether you may or may not be “self-serving”, so you just need to confirm your markup conforms with supported types.

Is this a good thing?

Generally speaking, a lot of the behaviour around how sites were being encouraged to collect and display reviews did seem disingenuous towards the user. In that sense, this may genuinely improve search results, and clarity of what a star rating means.

The other side of the argument is that Google is defining what webmasters should implement more strictly than the actual Schema.org specifications do, and they’re doing it unilaterally and without warning. This puts an onus on webmasters to make a change which can be difficult in larger organisations, and which may also wipe out the business case for previous, large investments of resource.

It would have been better, in my opinion, to make this a “from March 2020” onwards announcement, similar to the recent nofollow/sponsored/ugc link markup change.

Let me know your view in the comments below, or on Twitter!


WordPress Maintenance Tasks to Perform Regularly


Performing regular WordPress maintenance tasks is essential. If you run your business or blog on a WordPress website, steer clear of anything that might lead to a broken site. In terms of user experience, a broken website, a broken link, or a slow, outdated site is a big turn-off, and it leads directly to higher bounce rates, lower search engine rankings, and dissatisfied visitors.

Regular maintenance keeps your website optimized and performing at its best, and reduces the chances of a broken site. In this guide, we have compiled a list of essential WordPress maintenance tasks you can perform to keep your website up and running all the time.

Why are maintenance tasks so crucial?

A lot of beginners believe that creating a website is a one-time task and a one-time investment of effort, time, and money. Unfortunately, this is far from the truth. Let us look at the components that come together to build any WordPress website.

WordPress Core CMS
WordPress Web Hosting
Plugins
Themes
Creator’s content

The way these components come together determines whether you have woven yourself a great site or a lousy one. Building the website you envisioned is only the beginning of running a successful website; maintenance is the next important step.

All the above components must be checked regularly for any maintenance issues to ensure your site runs at its best. If you have a busy site with a lot of traffic, then you must check your website more often – ideally 4-5 times a year. If your website does not receive heavy loads of traffic, then checking your site twice a year would be good enough.

1. Manually backup your complete website

Having a backup remotely stored in a location of your choice can prove life-saving. In case your website breaks or something accidentally gets deleted while you are making changes to your site, having a backup will ensure that you do not lose it completely.

When I first started with WordPress and was building websites to learn the CMS, I had to learn this lesson the hard way. Many times I accidentally deleted an essential part of my site, or forgot to renew the hosting of a website I didn’t use very often, and all of it cost me a lot of time, money and, above all, the effort I had put in.

There are various plugins for WordPress that will automate this process for you. If you are the forgetful kind, these tools will run scheduled backups for you. Some excellent plugins are:

Updraft Plus

VaultPress (Jetpack Backup)

BackWPup

BlogVault

BackupBuddy

However, you should not rely on the plugins entirely. Plugins can fall out of date and stop running automatically, which might cost you your entire website. Thus, it is best to trigger a backup manually every once in a while to make sure you have a recent copy.

You can also consider investing in better web hosting plans that come with the perk of a remote location backup option. Many web host providers offer robust backup solutions that will take the pressure off your mind.
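
If you’re comfortable on the command line, you can also take a manual backup with WP-CLI. Here’s a minimal sketch, assuming you have SSH access and WP-CLI installed on the server (the file names are just examples):

# From the WordPress root folder: export the database
wp db export backup-2020-08-01.sql

# Archive the files (themes, plugins, uploads, and configuration)
tar -czf backup-2020-08-01-files.tar.gz wp-content/ wp-config.php

# Copy both files to a remote location before you rely on them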

2. Update your WordPress files

WordPress regularly releases updates for its CMS, which update the core files you are running. The upgrade process affects all files and folders included in the central WordPress installation, including all the core files used to run WordPress. If you have made any modifications to those files, your changes will be lost.

Whenever WordPress releases a new update, you will receive a notification on your admin dashboard. WordPress comes with its own in-built one-click install feature, which is the easiest way to update WordPress. If this doesn’t work for some reason, then you can manually update your WordPress as well.

You may have come across the “Briefly Unavailable” message while trying to open a website. This is not an error message but rather a notification page from WordPress. It appears when an update could not complete successfully and your website stays stuck in maintenance mode by mistake.

Usually, when an update is installing, this message is briefly displayed as WordPress turns on maintenance mode for your website for a few seconds. Once the installation is complete, WordPress automatically deletes the maintenance file to disable maintenance mode. However, sometimes, due to a slow hosting server response or a low memory issue, the update script will time out, interrupting the process. When this happens, WordPress does not get a chance to take your site out of maintenance mode.

This error can be solved very quickly by simply deleting the .maintenance file from your site’s root folder using FTP.
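
If you have SSH access rather than FTP, the same fix is a one-liner. A quick sketch, assuming you run it from your WordPress install directory:

# Run from the folder that contains wp-config.php
# (in an FTP client, enable "show hidden files" to see the file)
rm .maintenance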

3. Optimize your WordPress Database

If you have been using WordPress for a while now, you will have collected a lot of junk in your database that should go as soon as possible. By junk we mean post revisions, spam comments, trash, transient options, orphaned metadata, and so on. This useless data increases your database size, which in turn increases the size of your website backup. This has a direct impact on your upload, download, and restore times.

Cleaning up this unwanted data significantly reduces your WordPress database size, which means faster backups, easier restore, and improved database performance. You can use any plugin to make this process hassle-free. Many plugins let you set the preferences and optimize your data with a single click.

WP-Optimize is an excellent plugin built for exactly this purpose, developed by the team behind Updraft Plus. It also lets you compress your images; unnecessarily large images take up a significant amount of disk space, so this can help you free up more storage.
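
If you prefer to do some of this clean-up without a plugin, WP-CLI can handle the common cases. A rough sketch, assuming shell access (take a backup first):

# Delete all stored post revisions (skip this line if the list comes back empty)
wp post delete $(wp post list --post_type=revision --format=ids) --force

# Remove expired transients left behind in the options table
wp transient delete --expired

# Run MySQL's OPTIMIZE TABLE across the WordPress database tables
wp db optimize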

4. Reset all your passwords

Your website’s login page is the primary access point to your website. Password protection is the first line of defense against unauthorized access. You must set up a unique and robust password; a strong password contains letters, numbers, and special characters.

It is best not to use letters or numbers in any obvious sequence, yet you should keep the password memorable so that you can avoid writing it down. Even with a strong password, hackers might still find a way to slip into the website.

Make it a practice to change your passwords regularly – resetting them every six months is a reasonable baseline. That includes the login credentials for your WordPress admin, your database, and your SSH and FTP accounts.

To keep a strong password, you can follow these rules:

Include numbers, capitals, special characters (@, #, *, etc.)
Ten characters minimum; 50 characters ideal
Can include spaces and be a passphrase (Just don’t use the same password in multiple places)
Change passwords every 120 days, or 4 months
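
If you’d rather not invent passwords yourself, a short script can generate ones that satisfy the rules above. Here is a minimal Python sketch using only the standard library; the character set and length are just examples:

import secrets
import string

def generate_password(length=20):
    """Return a random password containing capitals, numbers and special characters."""
    alphabet = string.ascii_letters + string.digits + "@#*!?%"
    while True:
        candidate = "".join(secrets.choice(alphabet) for _ in range(length))
        # Only accept candidates that meet the rules listed above
        if (any(c.isupper() for c in candidate)
                and any(c.isdigit() for c in candidate)
                and any(c in "@#*!?%" for c in candidate)):
            return candidate

print(generate_password())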

5. Review comments for spam

The most annoying thing to deal with is spam comments. You need to go through them and delete every idiotic comment to keep your comment section clean and readable for your audience. Many people visit your comment section and leave spam comments linking to their products, pages, videos, etc. According to Akismet stats, the number of spam comments generated over one month is 6,208 times higher than the number of legitimate comments. This means WordPress gets battered by 487 billion spam comments.

If you don’t keep your posts clean from spam comments, then this will directly hurt your search engine rankings. Akismet is a powerful tool that can fend off 99% of spam comments. It will automatically keep away spam from your comment moderation queue.

Sometimes the plugin will also put a legitimate comment in the spam section, so it is wise to review the spam queue manually every once in a while. However, if you receive thousands of spam comments, you can use the batch-delete feature to remove them all at once.
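
If the spam queue has grown into the thousands, WP-CLI can clear it in one go. A quick sketch, assuming shell access; review the queue first, because anything deleted this way is gone for good:

# Count the comments currently marked as spam
wp comment list --status=spam --format=count

# Permanently delete everything in the spam queue
wp comment delete $(wp comment list --status=spam --format=ids) --force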

6. Fix 404 errors

To explain it very simply, when a user visits a website and lands on a page that does not exist, the website shows a 404 Not Found error. The most common reaction your audience will have is to simply leave the website and look for an alternative.

A 404 error drives potential customers away from your website – but why does it occur? Most commonly, it happens when the visitor either types an incorrect URL or follows a stray link to a page that has not been created on your website. It is also possible that a page that no longer exists still has dead links to it floating around the web.

You can tackle this by either displaying a personalized message to the visitor or redirecting them to a relevant page on your website. An easy way to set up redirects for these missing posts and pages is to use a plugin. You can use Redirection, which is a popular redirect manager for WordPress. This free plugin creates and manages redirects quickly and easily without needing Apache or Nginx knowledge. If your WordPress site supports permalinks, you can use Redirection to redirect any URL.
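
For reference, what a plugin like Redirection manages for you is equivalent to a redirect rule at the server level. On Apache, for example, a single 301 in .htaccess looks like this (the URLs are hypothetical):

# Send visitors and crawlers from the retired URL to its replacement
Redirect 301 /old-page/ https://example.com/new-page/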

7. Inspect all the WordPress forms

Many website owners overlook the importance of contact forms and the other forms they put on the site to interact with the audience. Many times the forms are not working correctly and are not sending responses to the configured email. Don’t let this mislead you into believing that you have low engagement via forms – maybe something is broken and simply needs fixing.

You need to test your forms by manually sending a message and checking whether it is delivered to the configured email address. If the message is not being delivered, you need to reconfigure your web host’s email settings for the forms.

You can also optimize your forms for better performance in terms of rejecting spam and not letting users send incorrect data. This will hugely benefit you by filtering the responses you get. You can configure your forms with data validation and a human-check field such as a CAPTCHA or reCAPTCHA.

8. Fix all broken links

If you have had your website on WordPress for long enough, you might observe that some outbound links to other websites, pages, and posts that you put on your website no longer work. This might be because those other websites and pages do not exist anymore or might have moved to a different address altogether.

You might also accidentally add broken images, poorly formatted links, or misspell your own links. These broken links usually display a not found error, which creates an annoying user experience. It increases the bounce rate significantly, ultimately affecting your search engine rankings.

Make fixing all broken links a part of your maintenance routine. You can use SEMRush, Ahrefs, and Google Search Console to find broken links and mend them.
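
If you want to spot-check a handful of links yourself, a small script will do the job. Here is a rough Python sketch using the requests library; the URLs below are placeholders for links pulled from your own posts:

import requests

links_to_check = [
    "https://example.com/about/",
    "https://example.com/an-old-post/",
]

for url in links_to_check:
    try:
        response = requests.head(url, allow_redirects=True, timeout=10)
        if response.status_code >= 400:
            print(f"BROKEN ({response.status_code}): {url}")
        else:
            print(f"OK ({response.status_code}): {url}")
    except requests.RequestException as error:
        print(f"FAILED: {url} ({error})")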

9. Optimize your images for WordPress

Images make up most of the weight of your site. Usually, people upload pictures without optimizing them for their blog’s dimensions and file size. On display, the image will be scaled down to the dimensions of your blog, but the full-size file still takes up space on your server. Thus, it’s essential to optimize your images before uploading them to your website.

This will help you shed excess weight and increase your page load speed. Slow-loading images put visitors off as well. The WP-Optimize plugin lets you reduce image sizes and optimize them for WordPress.

10. Run performance tests regularly

Many developers run performance tests only when they have just created the website. As time goes by and other activities take up more of their time, website owners rarely run performance tests on their sites.

Make it a part of your maintenance routine to check your website for performance issues. As mentioned repeatedly, faster-loading websites are not only better for user experience but also boost your search engine rankings. So make sure that you run performance tests every once in a while, and do not just check the performance of your homepage but also of the pages that receive the most traffic or engagement.

Conclusion on WordPress Maintenance Tasks

We hope this article helped you gain insight into why it is crucial to perform regular maintenance tasks for your WordPress website. A site that performs better, is faster to load, and doesn’t show any broken links is essential to offer a satisfying user experience. If your website is performing optimally, then it will boost your search engine rankings too.

Create a maintenance routine for your website and follow these steps to keep your site running at its best.

The post WordPress Maintenance Tasks to Perform Regularly appeared first on WP Fix It.


Apple Search


Google, Google, Google

For well over a decade Google has dominated search, to the point where most stories in the search sphere were about Google or something on its periphery.

In 2019 Google generated $134.81 billion in ad revenues.

When Verizon bought core Yahoo three years ago the final purchase price was $4.48 billion. That amount was to own their finance vertical, news vertical, web portal, homepage, email & web search. It also included a variety of other services like Tumblr.

Part of what keeps Google so dominant in search is their brand awareness. That is augmented by distribution as the default in Chrome and Android. And when it comes to buying search distribution from other players like Mozilla Firefox, Opera or Apple’s Safari, they can outbid everyone else, as their ad depth makes them much better at monetizing tier-2 and emerging markets than other search companies. Even if Bing gave a 100% revshare to Apple, they still could not compete with Google in most markets in terms of search monetization.

Apple as a Huge Search Traffic Driver

In 2019 Google paid just under £1.2 billion in default payments for UK search traffic, most of which went to Apple. Historically, when Google broke out their search revenues by region, the US was typically around 45% to 46% of search ad revenue and the UK around 11% to 12%, so it is likely Google is spending north of $10 billion a year to be the default search provider on Apple devices:

Apple submitted that search engines do not pay Apple for the right to be set as the primary default search engine on its devices. However, our assessment is that Google does pay to be the primary default on Apple devices. The agreement between Google and Apple states that Google will be the default web search provider and the same agreement states that Google will pay Apple a specified share of search advertising revenues. We also note that Google does not pay compensation to any partners that set Google Search as a secondary option. This further suggests that Google’s payment to Apple is in return for Apple setting Google as the primary default.

Apple is glad to cash those checks & let Google handle the core algorithmic search function in the web browser, but Apple also auto-completes many searches from within the address bar via various features like website history, top hit, news, Siri suggested website, suggested sites, etc.

A Unique Voice in Search

The nice thing about Apple powering some of those search auto-complete results themselves is that they are not simply a re-hash of the Google search results, so Apple can add a unique voice to the search marketplace: if your site isn’t doing as well in Google, it could still be promoted by Apple based on other factors.

High-traffic Shortcuts

Apple users generally have plenty of disposable personal income and a tendency to dispose of much of it, so if you are an Android user it is probably worth having an Apple device to see what they are recommending for core terms in your client’s markets. If you want to see recommendations for a particular country you may need to have a specialized router targeted to that country or use a web proxy or VPN.

Most users likely conduct full search queries and click through to listings from the Google search result page, but over time the search autocomplete feature that recommends previously viewed websites and other sites likely picks up incremental share of voice.

A friend of mine from the UK runs a local site where the Apple ecosystem drove nearly two-thirds of his website traffic.

His website is only a couple years old, so it doesn’t get a ton of traffic from other sources yet. As of now his site does not have great Google rankings, but even if it did the boost by the Apple recommendations still provides a tailwind of free distribution and awareness (for however long it lasts).

For topics covered in news or repeat navigational searches Apple likely sends a lot of direct visits via their URL auto-completion features, but they do not extend the feature broadly into the tail of search across other verticals, so only a limited set of searches ultimately benefits from the shortcuts.

Apple Search Ranking Factors

Apple recently updated their search page offering information about Applebot:

Apple Search may take the following into account when ranking web search results:

Aggregated user engagement with search results
Relevancy and matching of search terms to webpage topics and content
Number and quality of links from other pages on the web
User location based signals (approximate data)
Webpage design characteristics

Search results may use the above factors with no (pre-determined) importance of ranking. Users of Search are subject to the privacy policy in Siri Suggestions, Search & Privacy.

I have seen some country-code TLDs do well in their local markets in spite of not necessarily being associated with large brands. Sites which do not rank well in Google can still end up in the mix provided the user experience is clean, the site is useful and it is easy for Apple to associate the site with a related keyword.

Panda-like Quality Updates

Markets like news change every day as the news changes, but I think Apple also does some Panda-like updates roughly quarterly where they do a broad refresh of what they recommend generally. As part of those updates sites which were once recommended can end up seeing the recommendation go away (especially if user experience declined since the initial recommendation via an ad heavy layout or similar) while other sites that have good engagement metrics get recommended on related searches.

A friend had a website they sort of forgot that was recommended by Apple. That site saw a big jump on July 9, 2018 then it slid back in early August that year, likely after the testing data showed it wasn’t as good as some other site Apple recommended. They noticed the spike in traffic & improved the site a bit. In early October it was widely recommended once again. That lasted until May of 2019 when it fell off a cliff once more. They had monetized the site with a somewhat spammy ad network & the recommendation mostly went away.

The recommendations happen as the person types, and they may be different for searches where there is a space between keywords versus where the words are run together. It is also worth noting that Apple will typically recommend the www. version of a site over the m. version for sites that offer both, so if you use separate URLs it makes sense to ensure the www version also uses a responsive website design.

Indirect Impact on Google

While the Apple search shortcuts bypass Google search & thus do not create direct user signals to impact Google search, people who own an iPhone then search on a Windows computer at work or a Windows laptop at home might remember the site they liked from their iPhone and search for it once more, giving the site some awareness that could indirectly bleed over into impacting Google’s search rankings.

Apple could also eventually roll out their own fully featured search engine.



Crawl Budget Optimisation Through Log File Analysis


Log file analysis is one of those tasks you might not do often – due to data availability & time constraints – but that can provide insights you wouldn’t be able to discover otherwise, particularly for large sites. If you’ve never done a log analysis or are unsure what exactly to look for and where to start, I’ve built a guideline to help you:

Get started with some log file analysis tools
Understand what log files are useful for
Dig into the data and think about how to better redistribute crawling resources

Log files are essentially a diary of all requests made to your site over a specific time period. The data is very specific and more in-depth than you could gather from a crawl, Google Analytics and Google Search Console combined. By analysing this data you can quantify the size of any potential issue you discover and make better decisions about what to dig into further. You can also discover issues, such as odd crawler behaviour, that you could not identify through a regular tech audit. Log analysis is particularly valuable for large sites, where a crawl would require an extensive amount of time and resources.

Log file analysis tools

There are different tools out there for this task – Screaming Frog, Botify and BigQuery, to mention a few. At Distilled, we use BigQuery, which is quite flexible. A great place to get started if you’re not familiar with log analysis is the guideline Dom Woodman, senior consultant at Distilled, wrote on what a log file analysis is and how to do one.

Regardless of the tool you choose to use, you should be able to use the framework below.

Understand what log files are useful for

Log files are a really good source for:

Discovering potential problems: use them to find things you can’t find with a crawl, since a crawl doesn’t include Google’s historical memory
Identifying what to prioritise: knowing how often Google visits URLs can be a useful way of prioritising things.

The best part about log files is that they include all kinds of information you might want to know about, and more. Page response code? They have it. Page file type? Included. Crawler type? Should be in there. You get the idea. But until you slice your data in meaningful ways you won’t know what all this information is useful for. 

Digging into the data

When you begin to analyse logs, you should slice the information into big chunks to get a good overall picture of the data, because it helps you understand what to prioritise. You should always compare results to the number of organic sessions obtained, because it helps to establish whether crawling budget should be distributed differently.

These are the criteria I use to dig into the log file:

Top 10 URLs/paths most requested
200-code vs. non-200-code pages
URLs with parameters vs. without parameters
File type requests
Requests per subdomain

Before you begin

At this stage, you should also decide on a threshold for what represents a significant percentage of your data. For example, if you discover that there are 20,000 requests with a 301 response code and the total number of requests on the logs are 2,000,000, then knowing that the 301s are only 1% of total requests helps you bucket this as a low priority issue. This might change by type, for example, 10% of category pages with a 404 status code might be more important than 10% of product pages with a 404 code.
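
As a sketch of that bucketing step, here is how the percentages could be pulled out of a log export with pandas. The file name and column names are assumptions about how your logs happen to be stored:

import pandas as pd

# Hypothetical export: one row per request, with at least "url" and "status" columns
logs = pd.read_csv("access_logs.csv")

total_requests = len(logs)
share_by_status = (logs["status"].value_counts() / total_requests * 100).round(1)

# e.g. a value of 1.0 next to 301 means redirects are 1% of all requests
print(share_by_status)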

Once you begin obtaining results from your data, you should consider whether the current crawler behavior is the best use of crawling resources. The answer to this question will tell you what the following actions should be.

Top 10 URLs/paths most requested vs organic sessions they drive

Through log file analysis you’ll often discover a few paths or specific URLs that receive a significantly higher number of requests than the rest. These usually happen to be URLs linked from most templates (for example from the main nav or footer) or from external sources, but they often don’t drive a high number of organic sessions.

Depending on what type of URLs these are, you may or may not need to take action. For example, if 40% of resources are used to request a specific URL, is that the best use of crawling resources or could they be better distributed?

Below is an example of the breakdown of top requested paths from log analysis and how they compare to organic sessions they drive:

Making this comparison on a graph allows you to easily identify how crawling resources could be better distributed. The first two blue bars show that the majority of requests are to two specific paths which drive no organic sessions. This is a quick way to identify important wins right away: in the example above, the next step would be to understand what those URLs are and where they are found to then decide whether they should be crawled and indexed or what additional action may be required. A tech audit would not give you the information I show in the graph. 
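
As an illustration of that first slice, the top requested paths can be pulled out of the same hypothetical pandas export in a few lines, ready to set against organic sessions from your analytics data:

import pandas as pd

logs = pd.read_csv("access_logs.csv")  # same hypothetical export as above

# Strip query strings so requests group by path rather than by full URL
logs["path"] = logs["url"].str.split("?").str[0]

print(logs["path"].value_counts().head(10))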

Page response code

If a high percentage of the log requests are for non-200-code pages, you may want to dig into this area further. Query your data to discover the breakdown of non-200 response codes and, based on the results, dig further, prioritising those with the highest percentage.

Below is an example of non-200 code pages breakdown:

As visible above, almost 50% of all requests are to a non-200 status code page. In this case, investigate further into each status code to discover which type of pages they come from and what percentage each represents. As a side note, if you also encounter a large number of pages with a 304 status code, this is a server response essentially equivalent to a 200-status code. The 304 response indicates that the page has not changed since the previous transmission.

Here are some common checks you should do on non-200 code pages:

Are there patterns of internal links pointing to these pages? A crawl of the site would be able to answer this.
Is there a high number of external links/domains pointing to these pages?
Are any of these pages’ status code caused by certain actions/situations? (i.e. on ecommerce sites, discontinued products may become 404 pages or 301 redirects to main categories) 
Does the number of pages with a specific status code change over time?

URLs with parameters vs non-parameters

URLs with parameters can cause page duplication – very often they are just a copy of the page without parameters, creating a large number of URLs that add no value to the site. In an ideal world, none of the URLs discovered by crawlers would include parameters. However, this is not usually the case, and a good amount of crawling resource is used to crawl parameterised URLs. You should always check what percentage of total requests parameterised URLs make up.
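
Sticking with the same hypothetical log export, checking the size of the parameter problem takes only a couple of lines in pandas:

import pandas as pd

logs = pd.read_csv("access_logs.csv")

has_params = logs["url"].str.contains(r"\?", regex=True)
print(f"Parameterised URLs: {has_params.mean() * 100:.1f}% of all requests")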

Once you know the size of the issue, here are a few things to consider:

What is the page response code of these URLs?
How are parameterised URLs being discovered by crawlers?
Are there internal links to parameterised URLs?
Which parameter keys are found most often, and what is their purpose?

Depending on what you discover in this phase, there may be actions related to previous steps that apply here. 

File type requests

I always check the file type breakdown to quickly discover whether requests to resources such as images or JavaScript files make up a big portion. This should not be the case: in an ideal scenario the highest percentage of requests should be for HTML pages, because these are the pages Google not only understands but that you also want to rank well. If you discover that crawlers are spending considerable resources on non-HTML files, then this is an area to dig into further.

Here are a few important things to investigate:

Where are the resources discovered/linked from?
Do they need to be crawled or should they just be used to load the content?

As usual, you should bear in mind the most important question: is this the best use of crawling resources? If not, then consider blocking crawlers from accessing resources that don’t need to be indexed. This can easily be done by blocking them in robots.txt; however, before you do, you should always check with your dev.
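
As a purely illustrative example (the path is made up, and blocking resources that Google needs to render your pages can do more harm than good, which is exactly why you check with your dev first), a robots.txt rule to stop crawlers requesting such files might look like this:

User-agent: *
# Hypothetical folder of tracking scripts that do not need to appear in the index
Disallow: /assets/tracking/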

Requests per subdomain

You may not need this step if you don’t have any subdomains, but otherwise this is a check you should do to discover unusual behaviour. In particular, if you are analysing the logs of a specific domain, requests to other subdomains should be somewhat limited, depending on how your internal linking is organised. It also depends on whether Google sees the subdomains as part of your site rather than as separate sites.

As with the previous steps, this is the first breakdown of your data and based on the results it should tell you whether anything is worth digging further into or not. 

A few things to keep in mind in this section:

Should crawlers spend less/more time on subdomains?
Where are the subdomain pages discovered within your site?

This could be another opportunity for redistributing crawling budget to the pages you want crawlers to discover.

To wrap it up

As with many SEO tasks, there are many different ways to go about a log analysis. The guideline I shared is meant to provide you with an organised method that helps you think about crawling budget resources and how to better use them. If you have any advice on how you think about crawling budget resources, please leave your advice in a comment below. 


How to duplicate a post in WordPress, plus 4 reasons why!


If you write content in WordPress, duplicating a post can come in quite handy. It can save you a lot of valuable time to clone a post and adjust the content, instead of starting from scratch with every post you write. Fortunately, cloning a post becomes very easy with the Yoast Duplicate post plugin. In this article, you can read how to use it and we’ll discuss 4 everyday situations in which you might want to use it.

How to duplicate a post in WordPress

One of the newest additions to our Yoast stable is the Yoast Duplicate Post plugin. This simple but effective plugin helps you duplicate or clone a post in a few simple steps:

Install the Yoast Duplicate Post plugin

If you don’t have the plugin yet, simply go to Plugins in the backend of your WordPress site and install Yoast Duplicate Post. Not sure how to do this? This article explains how to install a plugin.

Click on Posts

After you’ve installed and activated the Yoast Duplicate Post plugin, you’re good to go. When you want to duplicate a post, go to your post overview where you’ll see all your posts listed. Find the post you want to clone:

Hover over the post you’d like to clone

If you hover your mouse over the post you’d like to clone, you’ll see some options appear under the post title:

Click on Clone post

When you want to duplicate your post, simply click “Clone” or “New Draft”. Both functions will clone your post. If you click on “New Draft” the clone will open directly so you can start working in it immediately. If you click “Clone” a duplicate of your post will appear as a draft post in the list:

Rename the clone

To prevent confusion, it’s best to rename your duplicate post right away. You can do this by clicking on the post and editing the title there. Or you can click on “Quick Edit” in the post overview and edit the title in this input field:

4 reasons to duplicate or clone a post

There are several reasons why you’d want to create a clone of an existing post. There might be more than 4 reasons, but here we’d like to highlight the ones that will be recognizable to most of us. Of course, you don’t want to publish the exact same or very similar content, as that might confuse search engines. So, in what situations should you use it?

1. Extensive updates on existing posts and pages

Keeping your content fresh and up to date is a sensible thing to do. You don’t want to show visitors outdated or incorrect information. Also, search engines prefer to serve users content that is regularly updated and accurate. Sometimes, updating is just a matter of changing a sentence here and there or fixing a typo, which you can easily do in an existing post. But if it needs more work, for instance, a complete rewrite of multiple paragraphs, you might want to work on this in a clone.

Working in a clone has a couple of advantages:

it allows you to adjust what’s needed, save it, re-read it, and correct it if necessary before your changes go live;
you can preview your post and see exactly what it looks like;
you can share the preview with others before you publish the changes.

When you’re sure the post is ready for publication, copy the content of the clone into the existing post and hit update. That way you’ll keep the old URL. If you do want to publish the clone instead, make sure to delete the old post and create a redirect!

2. Scheduled updates

In some cases, you don’t want to publish changes right away. You’ll have to wait until, for instance, a product is launched or an event has taken place. If you have a cloned post to work in, you can prepare the changes in advance and simply copy over the content or push the post live (don’t forget to redirect!) when the time comes. This will save you a lot of last-minute work and editing.

3. Merges of multiple posts

Large sites often have lots of content. Inevitably, the content you publish might become more alike over time. We notice this ourselves, as we write a lot about content optimization. Before you know it, you’ll have multiple posts on how to optimize a blog post. Is this a bad thing? Well, it might be, if you start competing with yourself in the search engines. We call this keyword cannibalization. We have a complete article on how to find and fix cannibalization in a smart way.

If you have posts that are very similar and compete for a top ranking in the search results, you’re better off merging them into one complete and high-quality post. In order to do so, you can check how these similar posts are doing, which one gets the most traffic and ranks highest. This is the preferred post or URL to keep. 

When you take a closer look at the other post you might find interesting stuff in there that your high-performing post is missing. Then, of course, add it! This might be quite a puzzle though, and that’s where duplicating your post comes in. If you create a clone, you can take a good look at both posts, take the best out of both of them and merge them into one awesome and complete post. When you’re done, copy the content from the clone into the best-performing URL and don’t forget to redirect the post you’re not keeping! 

4. Reusing a format

Especially in eCommerce, you might have a certain format for a product page. But also, for a series of posts on your blog, help pages on your site, or events, you might like to stick to a certain format. If you’re using a format you’re happy with, you can use the clone function to duplicate the page with the right format. Delete the content you shouldn’t keep and just fill the post with the content about other products, help info, or events. It’s as easy as that, and a huge time saver.

What do you use it for?

Do you already use the Yoast Duplicate Post plugin? We’d like to know what situations you use it in! As we’re continuously improving the plugin, we love to hear how you use it and what features could be useful to add or improve. So please share your thoughts here!

Read more: about the Yoast Duplicate Post plugin »

The post How to duplicate a post in WordPress, plus 4 reasons why! appeared first on Yoast.


Update to Aggregate Lighthouse Reporter


Hey everyone!

It’s been a hot minute since I posted so just wanted to quickly come in here and share an update we made to a free tool.

A while back we released a tool that would allow you to aggregate Lighthouse reports by template and visualize and report on assets across sites and at the template/page level in Google Data Studio. Sounds pretty cool right? You can read about it here and check out the GitHub repo here.

Shortly after that, Google released their Core Web Vitals and included them in a release of Lighthouse. So here we are. We have updated our repo to include Lighthouse 6.0 (this is a copy of our production repo, so it will automatically update).
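
If you just want to sanity-check a single URL against the new metrics before wiring up the aggregate reporting, the standalone Lighthouse CLI works too. A quick sketch, assuming you have Node.js installed (the URL is a placeholder):

npm install -g lighthouse

# Audit one page and write the full report, Core Web Vitals included, to JSON
lighthouse https://example.com/ --only-categories=performance --output=json --output-path=./report.json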

That means you can get all these beautiful visualizations of Core Web Vitals and a few other new things:


You can check out an example report to play around with here (sorry, not sorry Waste Management). And in case you missed it the first time, the GitHub repo is at the big button below.

Check it out on GitHub

The post Update to Aggregate Lighthouse Reporter appeared first on Local SEO Guide.
