SEO Articles

Google documents how to inject canonical tags using JavaScript

Google has updated its JavaScript SEO help document to add technical details on how to inject canonical link tags using JavaScript. Google added a new section titled "properly inject rel='canonical' link tag."

What is new. In the new section, Google recommends against implementing your canonical tags using JavaScript, but if you must, it explains the proper way to do so. Google wrote:

While we don't recommend using JavaScript for this, it is possible to inject a rel="canonical" link tag with JavaScript. Google Search will pick up the injected canonical URL when rendering the page. Here is an example to inject a rel="canonical" link tag with JavaScript:
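The example in Google's documentation follows this general pattern (a sketch: the fetch endpoint, response fields and URLs here are illustrative placeholders):

// Derive an identifier for the current page (placeholder logic),
// fetch its data, then inject a single rel="canonical" link tag
// into the document head.
const id = window.location.pathname.split('/').pop();

fetch('/api/cats/' + id) // placeholder endpoint
  .then((response) => response.json())
  .then((cat) => {
    const linkTag = document.createElement('link');
    linkTag.setAttribute('rel', 'canonical');
    linkTag.href = 'https://example.com/cats/' + cat.urlFriendlyName;
    document.head.appendChild(linkTag);
  });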

Google added this warning: "When using JavaScript to inject the rel='canonical' link tag, make sure that this is the only rel='canonical' link tag on the page. Incorrect implementations might create multiple rel='canonical' link tags or change an existing rel='canonical' link tag. Conflicting or multiple rel='canonical' link tags may lead to unexpected results."

Hit or miss. We have seen cases where Google can pick up canonical tags and other embedded elements, even structured data, injected using JavaScript. But it can be hit or miss, so if you are going to use JavaScript to inject your canonical tags, follow these directions precisely.

Why we care. Again, if you are injecting canonical tags using JavaScript, Google has finally officially documented the proper way to implement it. So check the documentation over here and make sure your implementation follows Google’s recommendations.

Forecasting web traffic using Google Analytics and Facebook Prophet

Ready to learn a quick-and-easy way to get traffic predictions for any amount of time in the future?


This article will show you how you can:

- Predict traffic changes, and maybe even let your boss know when periods of stagnation or negative growth are to be expected.
- Know what to expect during times of increased or decreased traffic, so you can tell whether your declines are in line with predictions or something might be going wrong and traffic is declining more than it should.
- Include a graph with an update to your boss or client of what's coming in advance, so they know you aren't just making excuses after the fact.

Want to skip the info and just click a few buttons? 

While we'll be going through the code that forecasts your web traffic and what each of the sections does, you can skip all that and jump right to the Colab here if you aren't interested in knowing what's going on and how to make adjustments.

For those who want to run the code locally and be able to edit the hyperparameters (a fancy name for some of the variables that do important things and generally have one value for a complete run of a model), let's go!

Important note before you begin: The further ahead you ask it to predict, the wider the gap between the low and high estimates gets as the model becomes “less sure of itself.”

How to forecast your Google Analytics traffic

We’ll be using two systems to accomplish our goal:

- UA Query Explorer: In this example, we're going to use Universal Analytics for our forecasting. I will adjust the code in the Colab to GA4 in about a year, but because the model needs a year or more of data to really do the job, using UA makes the most sense for now, and few people have GA4 data going back more than a year. UA Query Explorer is a tool that quickly and easily generates the API URL that will pull our analytics for us.
- Facebook Prophet: Prophet is a forecasting model built and open-sourced by Facebook. It includes a lot of great built-in features, such as the ability to import holidays. It's what will turn our analytics data into a forecast.

For those who wish to run locally, you can obviously do so, and the code provided will get the job done.

So, let’s dive in and get you predicting your future traffic!

1. Connect your instance

What this means is you’re “turning on” Google Colab so you can run scripts from it.

2. Import the needed libraries

The next thing we need to do is to import the libraries we need to make all this work. 

They are:

- pandas: a Python library for data manipulation (to help us work with time-series data structures).
- numpy: needed to work with arrays (like our date and sessions arrays).
- matplotlib: we'll be using this to create some visualizations.
- json: used to work with JSON data.
- requests: used to make HTTP requests (like pulling analytics data).
- fbprophet: used for time series forecasting.
- pystan: used to update probabilities, like the probability of the traffic being X on a date in the future.
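A sketch of what that install-and-import cell typically contains (the exact package pins in the Colab may differ; fbprophet is the older package name for what is now published as "prophet"):

!pip install pystan fbprophet

import json                      # parse the Analytics API response
import requests                  # make HTTP requests to the API
import numpy as np               # array handling for dates and sessions
import pandas as pd              # time-series data frames
import matplotlib.pyplot as plt  # visualizations
from fbprophet import Prophet    # the forecasting model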

To run it, all you need to do is click the play button.

You'll see a bunch of downloads start and the play button turn into a spinning icon indicating it's working. When the libraries are done downloading and installing, the play button will re-appear.

3. Sign up for Google Analytics demos & tools

You need to log in using the Google account tied to the analytics you want to access.

4. Configure the analytics you’re pulling

Next you need to select the account, property and view you want to pull your traffic data from.

Where it notes to pick a metric, you can pick from many of your traffic metrics depending on what you want to know. Examples might be:

- Sessions (the one I use most)
- Visitors
- Unique visitors
- Pageviews

Additionally, when you click the “segments” field a list of all the segments for the property (including custom segments) will display so you can select what traffic you want to look at.

After you’ve run the query just copy the API request URL:

5. Import analytics into the Colab

Click the play button in the next cell:

You will be asked to enter the API query you just copied:

Paste it in and hit “Enter.”

You should be presented with a graph of the traffic over the data range you selected:
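Under the hood, that cell does something along these lines (a simplified sketch; it assumes the pasted URL already carries valid credentials, while the real Colab also handles authentication and errors):

api_query = input('Paste your API request URL: ')

response = requests.get(api_query)
rows = response.json()['rows']   # each row: [date string, metric value]

df = pd.DataFrame(rows, columns=['date', 'sessions'])
df['date'] = pd.to_datetime(df['date'], format='%Y%m%d')  # ga:date format
df['sessions'] = df['sessions'].astype(int)

df.plot(x='date', y='sessions', figsize=(12, 4))
plt.show()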

6. Formatting

The next cell just changes the column headings to what Facebook Prophet expects.
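That boils down to a rename along these lines (using the column names from the sketch above):

# Prophet requires exactly these names: "ds" for the datestamp
# and "y" for the value being forecast.
df = df.rename(columns={'date': 'ds', 'sessions': 'y'})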

7. (Optional) Save

This step is completely unnecessary if you don't plan on referring back to the traffic numbers or forecasted numbers. I personally find it handy, but some won't.

The first thing you’ll track is simply the traffic numbers (same as you could export).

I promise it gets more interesting.
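For reference, the save step amounts to something like this (a minimal local-file version; the filename is an arbitrary choice, and the Colab itself may write somewhere else, such as a Google Sheet):

df.to_csv('analytics_sessions.csv', index=False)  # columns: ds, y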

8. Adding holidays

The next step is to add holidays and to determine how seasonality is considered. There are some options and ways you can tweak things, or you can run it as is.

The decisions you need to make are:

- What years do you want to pull the holidays for?
- What country do you want to pull the holidays for?

Additionally, you’ll notice the line:

m = Prophet(interval_width=0.95, yearly_seasonality=True, weekly_seasonality=True, daily_seasonality=False, seasonality_mode="additive", changepoint_range=0.85)

You can change any of the parameters to suit your needs, though these settings should work decently in most scenarios:

- interval_width: This is how uncertain we're willing to let the model be. Set to 0.95, it means that when training, 95% of all points must fit within the model. Set it too low, and the model follows general trends but isn't very accurate. Set it too high, and it chases too many outliers and becomes inaccurate in the other direction.
- yearly_seasonality: Monitors and responds to yearly trends.
- weekly_seasonality: Monitors and responds to weekly trends.
- daily_seasonality: Monitors and responds to daily trends.
- seasonality_mode: Set to either "additive" or "multiplicative". Additive (the default) results in the magnitude of change being constant; you'd use this in most cases to deal with things like holiday traffic spikes, where the percentage increase vs. pre-Black Friday is more-or-less steady. Multiplicative is used in scenarios where there are growing surges, for example a growing town that sees an additional increase each year. Not only is there growth, but that growth gets larger with each interval.
- changepoint_range: Change points are points where the traffic changes significantly. By default, Prophet only looks for change points in the first 80% of the time series (0.8); the 0.85 here lets the model react to shifts a little closer to the end of your data.

This is a tip-of-the-iceberg scenario. There are other parameters you can review and apply as you feel so inspired. Details on them are available here.

I've set things here to what seems to work well for me in most (but not all) cases.

Yearly and monthly seasonality impact most businesses. Daily, not so much.
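Put together, the holiday and model setup amounts to roughly the following (the country is the choice discussed above; 'US' is just an example):

m = Prophet(interval_width=0.95, yearly_seasonality=True,
            weekly_seasonality=True, daily_seasonality=False,
            seasonality_mode='additive', changepoint_range=0.85)
# Pull Prophet's built-in holiday calendar for one country;
# this must be called before fitting.
m.add_country_holidays(country_name='US')
m.fit(df)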

9. Crunch the numbers

Thankfully you don’t have to do it. 

Simply click the run button.

And you’ll soon see:

Not all the rows or columns are showing. If they were, what you’d see is:

- The highest number the model predicts likely (yhat_upper).
- The lowest (yhat_lower).
- The predicted value (yhat).

Importantly, you'll see "periods=90" in the code above. That is the number of days I'm going to get predictions for.

I’ve found 90 works decently. After that, the range gets pretty large between high and low but can be interesting to look at.
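In code, that cell is essentially this (a sketch matching the periods=90 mentioned above):

future = m.make_future_dataframe(periods=90)  # extend 90 days ahead
forecast = m.predict(future)
# yhat is the prediction; yhat_lower/yhat_upper bound the band.
print(forecast[['ds', 'yhat', 'yhat_lower', 'yhat_upper']].tail())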

10. (Optional) Save predictions

This is an optional step for those who would like to save their predicted values, or use them to test against different parameter values (those discussed in step eight above).

Once run, you’ll just click the link:

Which takes you to:

Each time you run it your numbers and results will be stored and can be easily accessed at a future time to compare with different runs.

It will also give you the numbers to reference if you’re ever asked for a predicted value for a specific day.

11. The magic

Hit the run button and you get what you've likely come here for.


I’ve added an extra Insights section. It simply displays the impact of some of the areas we’ve been discussing above.

You can see in the top chart where the different change points are. Further down, you get insights into how the different seasonal trends are impacting the predictions, etc.
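Those charts come from Prophet's built-in plotting helpers, roughly like so (a sketch using fbprophet's plotting module):

from fbprophet.plot import add_changepoints_to_plot

fig = m.plot(forecast)                            # forecast plus uncertainty band
add_changepoints_to_plot(fig.gca(), m, forecast)  # mark the change points
m.plot_components(forecast)                       # trend, weekly and yearly seasonality
plt.show()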


I’ve always looked for ways to predict in advance what’s coming my way.

It’s always better to show your boss or client that a slowdown is expected a week before it happens rather than try to explain it after the fact. 

Additionally, this insight can also help you plan your strategy.

Your work may be different during your peak traffic periods than it is when you're in a lull. You can look back over your analytics trends month-by-month, year-by-year and try to piece it together – or just let machines do what machines do best.

Just a reminder, if you got to the bottom and wanted to get to the Colab to run this yourself, you’ll find it here.

3 ways to dominate with Google Auction Insights and search intelligence

While marketers have overcome many challenges in recent years, sadly, the second half of 2022 is poised to be very different from the first. Unprecedented macroeconomic factors such as high inflation, food and energy costs and the war in Ukraine are impacting our business.

Hiring is starting to slow down, and the pressure is on. There is a good chance you'll be asked to do more with less, as budgets may be reprioritized and cut in certain areas.

On the flip side, Google Search ad spend continues to rise. It’s the channel that is always on, the channel with the highest conversion rate and the channel that won’t go away.

This part of the marketing mix is reliable and constant, but are the campaigns delivering success? Are they contributing to sales? Contributing to leads?

Auction Insights is a powerful tool we’ve all come to use for understanding campaign performance against competitors. Search intelligence adds another layer of granularity to ensure you’re one step ahead of your competition.  

Join Ashley Fletcher, VP of Marketing at Adthena, in his informative SMX Advanced session to explore three easy search intelligence tactics that will help you dominate your competitive landscape. He also shares use-cases from L’Oreal and Avanti West Coast trains.  

After this session, you’ll be able to save time with competitive monitoring, track performance over time and see your competitor’s spend and ad copy. The presentation will help you use data to make better ad campaign decisions and dig into search intelligence to understand why certain ads are successful to ultimately dominate the competition.

Bot traffic: What it is and why you should care about it

Bots have become an integral part of today's digital space. They help us order groceries, play music on our Slack channel, and pay our colleagues back for the delicious smoothies they bought us. Bots also populate the internet to carry out the functions they're designed for. But what does this all mean for website owners? And perhaps more importantly, what does this mean for the environment? Read on to find out what you need to know about bot traffic and why you should care about it!

What is bot traffic?

To begin, a bot is a software application created to perform automated tasks over the internet. Bots can imitate or replace the behavior of a real user. They’re very good at executing repetitive and mundane tasks. They’re also swift and efficient, which makes them a perfect choice if you need to do something on an enormous scale.

Bot traffic refers to non-human traffic to a website or app. If you own a website, you’ve likely been visited by a bot. Bot traffic accounts for more than 40% of the total internet traffic in 2022. We’ve seen this number rising in recent years, and we will continue to see this trend in the foreseeable future.

Bot traffic sometimes gets a bad name, and in many cases, bots are indeed bad. But there are good and legitimate bots too; it depends on the purpose of those bots. Some bots are essential for operating digital services like search engines or personal assistants. Others want to brute-force their way into your website and steal sensitive information. So which bot activities are 'good' and which are 'bad'? Let's go a bit deeper into these two kinds of bots.

The ‘good’ bots

The ‘good’ bots carry out specific functions that do not cause harm to your website or server. They announce themselves and let you know what they do on your website.

The most popular bots of this type are probably search engine crawlers. Without crawlers visiting your website to discover content, search engines would have no way to serve you information when you search for something. When we talk about 'good' bot traffic, we're talking about these bots. It's perfectly normal for a site to have a small percentage of traffic coming from 'good' bots. Other than search engine crawlers, some other good internet bots include:

- SEO crawlers: If you're in the SEO space, you've probably used tools like Semrush or Ahrefs to do keyword research or gain insight into competitors. For those tools to serve you information, they also need to send out bots to crawl the web to gather data.
- Commercial bots: Commercial companies send these bots to crawl the web to gather information. For instance, research companies use them to monitor news on the market; ad networks need them to monitor and optimize display ads; 'coupon' websites gather discount codes and sales programs to serve users on their websites.
- Site-monitoring bots: They help you monitor your website uptime and other website metrics. They periodically check and report data such as your server status and uptime duration so you can take action when something's wrong with your site.
- Feed/aggregator bots: They collect and combine newsworthy content to deliver to your website visitors or email subscribers.

The ‘bad’ bots

The 'bad' bots are created with malicious intentions in mind. You are probably familiar with spam bots that flood your website with nonsense comments, irrelevant backlinks, and atrocious advertisements. You've probably also heard of bots that take people's spots in online raffles or buy up the good seats at concerts.

Because of these malicious bots, bot traffic gets a bad name. Unfortunately, a significant amount of bot traffic comes from such ‘bad’ bots. It is estimated that bad bot traffic will account for 27.7% of internet traffic in 2022. Here are some of the bots that you don’t want on your site:

- Email scrapers: They harvest email addresses and send malicious emails to those contacts.
- Comment spam bots: They spam your website with comments and links that redirect people to a malicious website. Or, in many cases, they spam your website to advertise or to try to get backlinks to their sites.
- Scraper bots: These bots come to your website and download everything they can find, including your text, images, HTML files, and even videos. Bot operators will then re-use your content without permission.
- Bots for credential stuffing or brute force attacks: These bots will try to gain access to your website to steal sensitive information. They do that by trying to log in like a real user.
- Botnets, zombie computers: These are networks of infected devices used to perform DDoS attacks. DDoS stands for distributed denial-of-service. During a DDoS attack, the attacker uses such a network of devices to flood a website with bot traffic, overwhelming the web server with requests and leaving the website slow or unusable.
- Inventory and ticket bots: They go to websites to buy up tickets for entertainment events or to bulk purchase newly released products. Brokers use them to resell the tickets or products at a higher price to make a profit.

Why you should care about bot traffic

Now that you've got some knowledge about bot traffic, let's talk about why you should care about it.

For your website security and performance

We’ve discussed several types of bad bots and their functions. You do not want malicious bots lurking around your website. They will undoubtedly wreak havoc on your website performance and security.

Malicious bots disguise themselves as regular human traffic, so they might not be visible when you check your website traffic statistics. That can hurt your business decisions because you don't have the correct data. You might see random spikes in traffic but not understand why. Or you might be confused as to why you receive traffic but no conversions.

Next to this, malicious bot traffic strains your web server and might sometimes overload it. These bots take up your server bandwidth with their requests, making your website slow or utterly inaccessible in case of a DDoS attack. In the meantime, you might have lost traffic and sales to other competitors.

And malicious bots are bad for your site's security. They will try to brute-force their way into your website using various username/password combinations, or seek out weak entry points and report them to their operators. If you have security vulnerabilities, these malicious players might even attempt to install viruses on your website and spread them to your users. And if you own an online store, you have to manage sensitive information like credit card details, which hackers would love to steal.

For the environment

Let’s come back to the question at the beginning of the post. You need to care about bot traffic because it affects the environment more than you might think.

When a bot visits your site, it makes an HTTP request to your server asking for information. Your server needs to respond to this request and return the necessary information. Whenever this happens, your server must spend a small amount of energy to complete the request. But if you consider all the bots on the internet, the amount of energy spent on bot traffic is enormous.

In this sense, it doesn’t matter if a good or bad bot visits your site because the process is still the same. They both use energy to perform their tasks, and they both have consequences on the environment. Even though search engines are an essential part of the internet, they are guilty of being wasteful too.

You know the basics by now: search engines send crawlers to your site to discover new content and refresh old content. But they can visit your site too many times and not even pick up the right changes. We recommend checking your server log to see how many times crawlers and bots visit your site. The Crawl Stats report in Google Search Console also tells you how many times Google crawls your site. You might be surprised by some numbers there.

A small case study from Yoast

Let's take Yoast, for instance. On a given day, Google crawlers can visit our website 10,000 times. That might seem reasonable, but they only crawl 4,500 unique URLs. That means energy was used crawling the same URLs over and over. Even though we regularly publish and update our website content, we probably don't need all those crawls. And these crawls aren't just for pages; crawlers also go through our images, CSS, JavaScript, etc.

But that’s not all. Google bots are not the only ones visiting us. There are bots from other search engines, digital services, and even bad bots. Such unnecessary bot traffic strains our website server and wastes energy that could otherwise be used for other valuable activities.

Statistics on the crawl behavior of Google crawlers in a single day

What to do against ‘bad’ bots

You can try to detect bad bots and block them from entering your site. That will save you a lot of bandwidth and reduce strain on your server, which in turn helps save energy.

The most basic way to do this is to block an individual IP address or an entire range of IP addresses. If you identify irregular traffic from a source, block that IP address. This approach works, but it's labor-intensive and time-consuming. Alternatively, you can use a bot management solution from providers like Cloudflare. These companies have an extensive database of good and bad bots, and they use AI and machine learning to detect malicious bots and block them before they can cause harm to your site.
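On an Apache server, for example, the manual approach can be a few lines of .htaccess (the addresses below are documentation-range placeholders, not real offenders):

# Apache 2.4: allow everyone except specific addresses/ranges.
<RequireAll>
  Require all granted
  Require not ip 203.0.113.7
  Require not ip 198.51.100.0/24
</RequireAll>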

You should install a security plugin if you’re running a WordPress website. Some of the more popular security plugins (like Sucuri Security or Wordfence) are maintained by companies that employ security researchers who monitor and patch issues. Some security plugins automatically block specific ‘bad’ bots for you. Others let you see where unusual traffic comes from and decide how to deal with that traffic.

What about the ‘good’ bots

As we mentioned earlier, the 'good' bots are good because they are essential and transparent in what they do. But they can consume a lot of energy while performing their tasks, which impacts the environment. Not to mention, these good bots might not even be helpful to you. Even though what they do can be considered 'good,' they might still bring disadvantages to your website and, ultimately, to the environment. So what can you do about the good bots?

1. Block them if they are not useful

You need to think and decide whether or not you want these 'good' bots to crawl your site. Does their crawling benefit you? And, more importantly, does that benefit outweigh the cost to your servers, their servers, and the environment?

Let’s take search engine bots, for instance. You know that Google is not the only search engine out there. It’s most likely that crawlers from other search engines have visited you. Let’s say you check your server log and see that a search engine has crawled your site 500 times today, but it only brings you ten visitors. If that’s the case, would it be useful to let bots from that search engine crawl your site? Or should you block them because you don’t get much value from this search engine?

2. Limit the bot’s crawl rate

If they support the crawl-delay directive in robots.txt, you should try to limit their crawl rate so they don't come back every 20 seconds and crawl the same links over and over. This is very useful for medium to large websites that crawlers visit often, but small websites benefit from crawl delays too. Most likely, you don't update your website content 100 times a day, even on a larger site. And if you have copyright bots visiting your site to check for copyright infringement, do they need to come every few hours?
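In robots.txt, a crawl delay looks like this (the bot names are examples of crawlers that honor the directive; the delay is in seconds):

User-agent: bingbot
Crawl-delay: 10

User-agent: AhrefsBot
Crawl-delay: 10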

You could play with the crawl rate and monitor its effect on your website, and you can assign a specific crawl delay to crawlers from different sources. Start with a slight delay and increase the number when you're sure it doesn't have negative consequences. Note that Google doesn't support crawl-delay, so you don't need to set it for Google's bots.

3. Help them crawl more efficiently

You can decide which parts of your site you don’t want bots to crawl and block their access via robots.txt. This not only saves energy but also helps to optimize your crawl budget.

There are a lot of places on your website where crawlers have no business coming. That can be your internal search results, for instance. Nobody wants to see those on public search engines. Or, if you have a staging website, you probably don’t want people to find it.
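Blocking those sections takes a few lines of robots.txt (the paths are illustrative; /?s= is WordPress's internal search parameter, and /staging/ is a hypothetical staging path):

User-agent: *
Disallow: /?s=
Disallow: /search/
Disallow: /staging/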

Next, you can help bots crawl your site better by removing unnecessary links that your CMS and plugins automatically create. For instance, WordPress automatically creates an RSS feed for your website comments. Of course, this RSS feed has a link, but hardly anybody looks at it, especially if you don't have a lot of comments. Hence, this RSS feed might not bring you any value; it just creates another link for crawlers to crawl repeatedly, wasting energy in the process.

Optimize your website crawl with Yoast SEO

We’ve recently launched a feature in Yoast SEO Premium that lets you optimize your website to make it easier for crawlers to crawl your site. Within the crawl settings in Yoast SEO Premium, you’ll find many toggles that let you turn off various things WordPress automatically adds to your site that most sites won’t miss.

At the moment, there are 20 toggles available in the crawl setting. We’ve added a lot more options since the feature was first released in Yoast SEO Premium 18.6. It’s good to know this is currently in beta. We will be working hard to improve this feature and add more settings to help you optimize your site’s crawlability. Check out this page to learn more about our crawl feature!

Google Maps adds new store location feature, Locator Plus, Reserve with Google integration, new analytics and more

The Google Maps Platform has added a bunch of new features that let businesses integrate more deeply with its APIs and streamline how their sites work with Google Maps. These include the ability to embed Reserve with Google on your site, a new store location embed tool, a Locator Plus feature, store locator analytics and more.

Reserve With Google Embed Feature

We have seen the Reserve with Google feature in Google Search and Google Maps for a while now. Now, Google is allowing businesses to use Reserve with Google on their own sites or portals.

Google added new APIs that allow you to embed Reserve with Google on your site and your own maps. Google said this allows for an "end-to-end appointment booking flow, and connects users to a variety of services." It gives customers using your store locator a booking option right within the locator, which creates an easier booking process.

Here is how it works:

Store Locator Embed Feature

Google also has rolled out a way to manage and publish your store locator using the Google Maps Platform. Google said this will allow you to "quickly update and roll out your store locator" on your site. With the Locator Plus solution, you can capture the location of every single store you want to show users, all within one map, Google explained. This can now be done with a simple copy and paste of some embed code and very little API development coding.

Google Locator Plus

Similar to the store locator feature, the new Locator Plus feature allows businesses to easily import business details from their Google Business Profile. This will allow the business details you already have in your Google Business Profile to be reflected in the store locator on your website. The details include hours, contact information, photos, service options, and more.

Google Store Locator Analytics

With all these announcements, Google also is rolling out a new Google store locator analytics dashboard. This analytics dashboard should help you better understand the impact of your implementation and generate insights from your data. It shows you how well your site visitors are engaging with your store locator, Google said.

The dashboard helps you measure your performance week over week, including number of views, number of interactions with Google Search and Google Place Details, and overall engagement rate. The dashboard uses anonymized data to provide important benchmarks on how a developer’s implementation compares against other developers using the same solution.

Why we care. All these tools can be useful for large and small businesses to manage their local presence not just on Google but also on their own site. Plus, searchers are accustomed to Google and may find these embed features familiar and be more likely to use them.

At the same time, this puts more and more of your data in Google, making you more reliant on Google for the management and hosting of these features and data. So keep that in mind before implementing them on your site.

Webinar: Work smarter, not harder, to give customers what they want

Personalizing your marketing campaigns for one customer is easy, but what about hundreds or thousands of customers across multiple marketing channels?

Work smarter, not harder, by using artificial intelligence (AI) as part of your martech stack and giving your customers the unique experiences they crave.

Register today for “Use Data to Create Next-Level Customer Experiences at Scale,” presented by MoEngage.
