Blog

How to migrate to Google Analytics 4: A step-by-step guide

Posted on Jun 30, 2022 in SEO Articles

You’ve likely heard by now that Google is updating Google Analytics from Universal Analytics (also known as GA3 or UA) to a new, upgraded version, Google Analytics 4 (GA4). 

The migration for many of us from GA2 (Classic Analytics) to GA3 (Universal Analytics) was relatively painless 10 years ago.

This migration isn’t quite as simple. 

There are many differences between the current Google Analytics you’re likely using (UA) and the new version of Google Analytics (GA4), and not all of the features of UA are present in GA4. 

Additionally, Google is pressing us to update now. As of July 1, 2023, the free version of Google UA will no longer collect data. 

That means you need to seriously address your analytics plan as soon as possible to ensure that, as of that date, your new GA4 property is tracking correctly and can provide you with accurate year-over-year data.

Here’s how to migrate to Google Analytics 4.

Phase 1: Create your GA4 property and launch it

Of utmost importance is creating your new GA4 properties and launching them immediately. 

GA4 properties won’t import historical data from UA, which means that your GA4 property will only track traffic data from the moment you create it onward.

So the sooner you create it, the sooner you’ll have data populating in the GA4 property.

Timeline: Ideally, this should be done before July 1, 2022. But if you miss this date goal, just create your GA4 property (or properties) as soon as possible.

To launch the new property, you’ll need to:

- Create the new GA4 property.
- Add the new GA4 tracking tag to your site.

This is most easily accomplished if you use Google Tag Manager. After deployment, check the new property over the next few days to ensure you’re seeing traffic data populating in the property.

Phase 2: Make a list of your key items

New analytics properties do not inherit specific tracking items (e.g., goals, events) from any other properties (including UA properties). 

The following is a list of the most common tracking items I use in Google Analytics. You may have additional ones, but these are the common ones to put on your list:

- Events
- Goals (Conversions)
- Content Groupings
- Custom Dimensions/Metrics
- Referral Exclusions
- Product Links
- Audiences

Once you’ve created your list, evaluate which items you need to keep, which you can discard, and where gaps exist where you may want to create new tracking items, such as new events or new goals.

Remember that goals are created in each reporting view. Reporting views are not used with GA4, so if you want to preserve all of the goals you currently have in multiple reporting views for the same UA property, then you’ll need to list all of them and recreate them in the GA4 property. 

Just as UA limited you to 20 goals per reporting view, GA4 limits you to 30 conversions per property.

When you list out your current goals, be sure to note which ones are “non-event” goals (for example, destination-based goals), as you’ll need to make some changes to how you track those going forward.

Phase 3: Begin migrating individual items to GA4

Once you have your list of items to recreate in GA4, the real setup work begins!

Here are the most common items to set up, with some tips for each:

Events

Events in GA4 work similarly to events in UA, but you may need to set up the tagging anew for GA4.

Some events you may have manually set up in the past, like scroll depth, are now automatically added for you in GA4.

So first, check the automatically created events already tracking in your GA4 property by looking at the events under Configure in the navigation. No need to recreate events that Google has already created for you!

Like with adding the general GA tracking code to your site, Google Tag Manager is the easiest tool to use for this effort.

Goals (Conversions)

In GA4, goals are now renamed “Conversions”, and all goals are event-based.

When migrating your existing UA goals to GA4, I suggest starting with the event-based goals, as those are most similar to the original goal setup in UA.

Once you’ve set up the events in GA4 and marked them as conversions, move on to the destination-based and engagement-based goals.

For goals that were previously destination-based, you can either add the goal to GA4 via the interface or via code. For goals that were previously engagement-based, you’ll first need to create a GA4 audience (see below) and then recreate the engagement-based goals utilizing that audience.

Content Groupings

In UA, content groupings were created in the interface itself. However, in GA4 there is no interface setup – all content groups are created through page tagging.

In some ways, this is a nice change, but it requires a lot of time investment at the onset.

A page can have multiple “gtags” on it, and the simplest way to implement these will likely be Google Tag Manager.

If you wish to implement content groupings in GA4, visit this reference guide from Google.

Custom Dimensions/Metrics

Like with UA, setting up custom dimensions and metrics is a two-step process – it requires setup in both the interface and the code.

Your existing UA custom dimensions and metrics tags may migrate over fine to GA4, but you will still need to set up the dimensions and metrics in the GA4 property interface.

To set up custom dimensions and metrics in the interface, refer to Google’s setup guide.

Referral Exclusions

Referral exclusions still exist in GA4, but they’ve essentially been renamed and moved a few layers down from the top admin navigation levels.

To add referral exclusions, under your GA4 property admin menu, select Data Streams, then your site data stream (your URL), then select More Tagging Settings under the Additional Settings section.

Finally, click Configure Your Domains and enter your domain and any other domains (such as those from third-party apps that integrate with your website, like certain marketing automation tools).

Product Links

You’ll need to reconnect your Google products’ links to your new GA4 property. Note that it’s OK to have your Google properties connected to multiple GA properties, so you don’t need to remove your existing UA product links to connect GA4 too.

Product Links now appear at the top level of the property admin navigation. Select each of the Google products you use, like Google Ads, and connect your new GA4 property(ies).

Audiences

Google Analytics audiences are helpful for advertising purposes and now also conversion setup in GA4. It’s important to set up your audiences long before July 1, 2023 so that you can update your Google Ads campaigns with comparable, viable audience lists when the UA properties stop tracking. 

To recreate your audiences in GA4, first focus on the audiences in your list in UA (at the property level) and look for those that have Google Analytics as the audience type. Those will need to be recreated in GA4.

However, the terminology and way you create audiences has changed in GA4, so refer to Google’s audience creation guide for assistance.

Ecommerce

Like almost everything in the UA-to-GA4 migration, ecommerce tracking won’t magically move from UA to GA4. Google recommends creating a separate set of tags for GA4 ecommerce tracking rather than reusing your UA tags.

Here again, Google Tag Manager is likely the easiest and fastest way to implement your ecommerce tagging across the site.

For detailed information for ecommerce migration, visit Google’s GA4 ecommerce migration guide.

Timeline: Because these items will only start tracking when they are created, ideally the tracking items above should be implemented before July 1, 2022. However, if you can’t complete them all before July 1, 2022, just complete them as quickly as you can.

Phase 4: Check your items

Once you’ve launched your tracking items in the new GA4 properties, you’ll need to double-check that they are tracking properly. 

Evaluate your ecommerce, conversions, event tracking and more to ensure they are tracking as expected in the new properties. If not, troubleshoot the issue and fix it as soon as you can.

Phase 5: Determine a date for migrating to GA4 as your single source of truth

Organizations rely on Google Analytics for reporting across many departments, so it’s important that the organization agrees on when the new GA4 property(ies) will become the “single source of truth” for data and reporting.

As a best practice, you should wait until you have year-over-year data in your GA4 property(ies) before changing your single source of truth, in part because the metrics and tracking in GA4 are so different from UA that you cannot accurately compare UA data from one year against GA4 data from another.

If you can get your new GA4 property implemented prior to July 1, 2022, then you can likely start using it as your single source of truth as of July 1, 2023.

Regardless, if you use the free version of Google Analytics, you’ll be forced to migrate to GA4 as your primary source of truth on July 1, 2023, even if the year-over-year data with UA isn’t comparable.

Phase 6: Archive your UA data

To add insult to injury, Google decided that, in addition to forcing us all to migrate to GA4 now, it will also delete all of our historical UA data beginning on January 1, 2024.

While you do have a bit more time to archive this data, you should plan on archiving in case you need to reference it in the future. 

First, determine what data you regularly need. For example, I often use the source/medium report. 

Then consider the intervals in which you access this data. Typically, I access data on a monthly basis, such as June 1-30. You’ll want to archive your data in a manner that matches these data usage habits.

I personally find the UA interface clunky for archiving purposes. In my example of the source/medium report and pulling monthly data, in the interface, you can only pull two months of data at a time (one as the original month and one as the comparison month), then download the data to CSV. That will take forever! 

Instead, especially if you’re not a developer who knows how to use the Google Analytics API, consider using the Google Analytics Spreadsheet Add-On, which works with Google Sheets. It’s super handy and pulls that data fast! 

Just be sure you don’t run into data sampling issues, and if you do, take smaller reports. 

For example, if I pull 10 years of data from the source/medium report broken down by month, it may be so much data that it forces Google to sample the data. If that’s the case, I would try breaking it down into several report pulls, perhaps one year’s worth of data per report. You can always combine the data into one sheet once it’s pulled.
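If you’re comfortable with a little Python, here is a rough sketch of pulling one year of monthly source/medium sessions with the Reporting API v4 client (the view ID and key file are placeholders; this assumes the google-api-python-client and google-auth packages):

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials file and view ID – swap in your own.
creds = service_account.Credentials.from_service_account_file(
    'service-account-key.json',
    scopes=['https://www.googleapis.com/auth/analytics.readonly'])
analytics = build('analyticsreporting', 'v4', credentials=creds)

# One year of monthly source/medium sessions in a single request.
report = analytics.reports().batchGet(body={
    'reportRequests': [{
        'viewId': '12345678',  # your UA view ID
        'dateRanges': [{'startDate': '2021-07-01', 'endDate': '2022-06-30'}],
        'metrics': [{'expression': 'ga:sessions'}],
        'dimensions': [{'name': 'ga:yearMonth'}, {'name': 'ga:sourceMedium'}],
    }]
}).execute()

for row in report['reports'][0]['data']['rows']:
    print(row['dimensions'], row['metrics'][0]['values'])

Loop the date ranges one year at a time and you have your archive without touching the interface.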

Timeline: If you are using the free version of UA, you will need to do this between July 1 and December 31, 2023; your data will be deleted beginning January 1, 2024. If you are using UA 360, you have slightly longer, as 360 properties will continue processing data until October 1, 2023.

Finally, don’t panic!

I know, it’s all stressful. Hang in there. It’s going to be OK. 

I speak on this subject around the country, and recently someone asked me if there’s anything good about GA4.

The answer is a resounding yes! 

GA4 is aiming to get us all closer to true ROI and cross-device reporting. 

However, growth and change are difficult. We humans don’t tend to enjoy it. 

But it truly will all be OK. Just prioritize this now, and if you need my help, please reach out. Data is my passion, and I want yours to be accurate! You can reach me at [email protected]

What you need to know about adopting Google Analytics 4

Want to watch my SMX Advanced session, which explored the differences between GA4 and Universal Google Analytics? You can watch it on-demand.

Not registered for SMX Advanced? Get your free pass here.
Already registered for SMX Advanced? Log in here.

The post How to migrate to Google Analytics 4: A step-by-step guide appeared first on Search Engine Land.

11 Google Sheets formulas SEOs should know

Posted on Jun 30, 2022 in SEO Articles


Sometimes the best SEO tools are free.

Look no further than Google Sheets.

While it’s not great at plotting ranking data (inverting the y-axis is always ugly), there are numerous ways to use Google Sheets for SEO.

Here are 11 of the formulas and tips I find myself using for SEO on an almost daily basis – for keyword management, internationalization, content/URL management and dashboards.


Google Sheets formulas for keyword management 

- VLOOKUP
- CONCATENATE
- FLATTEN
- LOWER

=VLOOKUP(text, [range to search], [index of the column to return], [true/false])

VLOOKUP (documentation)

VLOOKUP, which stands for “vertical lookup”, is arguably one of the very first Google Sheet formulas for SEO anyone learns when getting into the game.

VLOOKUP allows you to essentially combine two data sets on common values, an almost lowbrow JOIN in SQL if you will.

I generally use this formula to enrich information about keyword sets by adding search volume, PPC data or adding downstream metrics like signups. 

The final true/false argument specifies how exact you want the match to be: TRUE allows a non-exact match, while FALSE returns exact matches only.

Tip: LOCK the range you’re searching against using $ ($E$3:$E$5 in the below example) so you can drag and carry the same formula across many rows.
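A hypothetical example, assuming your keywords sit in column A and a lookup table in E3:F100 holds keyword/search-volume pairs:

=VLOOKUP(A2,$E$3:$F$100,2,FALSE)

The 2 returns the second column of the locked range (the search volume), and FALSE forces an exact match.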

=CONCATENATE(A1,A2,A3) 

CONCATENATE (documentation)

- =CONCATENATE(A1,A2,A3) – you have the option to concatenate columns
- =CONCATENATE(A1," I'm additional text") – or literal words and characters

Concatenate is one of the most commonly used Google Sheet formulas in SEO, and for good reason.

It can serve a variety of use cases, including creating keyword lists (concatenating two+ variables together), creating URL strings, or even bulk templatizing metadata.

As the name suggests, you can use it to simply string any amount of values together.

Just remember: if you need a space between keywords, a literal space “ “ must be added.

=FLATTEN(range1, [range2, …]) 

FLATTEN (documentation)

=FLATTEN(A:D) would compress all ranges in A – D into one column

There’s a reason FLATTEN comes right after CONCATENATE. After you’ve concatenated several thousand keywords – and committed a couple of hundred dollars of rank-tracking budget – you generally need to upload the keywords into your rank tracking tool’s UI or via a CSV bulk upload.

It can be tedious when you have a 20×20 block of keywords to get them into a single column so you can upload all your keywords in one go.

With FLATTEN, you essentially select the range of data you want and the output is all of your keywords in one column to make copy-pasting a dream!

=LOWER(text) 

LOWER (documentation)

This one’s pretty simple – but it can be helpful to LOWERcase all of the keywords you’re managing (especially if you use a service provider that charges for things like duplicates) or if you’re in a case-sensitive environment like SQL.

LOWER is admittedly one of the simplest Google Sheets formulas for SEO.

The opposite (UPPER) also works, should you feel like auto-capping everything. 

=COUNTIF(range,"[text or function]")

COUNTIF (documentation)

COUNTIF lets you count, with accuracy, any literal text you want to match or even some numerical values that meet conditional rules.

It’s particularly useful when grouping together pages, managing an upcoming content calendar or sorting keywords on common dimensions like the page type or product they support.

It can also be used with conditions to match values, such as ones that have CPCs > $10.00 or that have a search volume > 100 searches a month. 
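For instance, assuming page types in column B and CPCs in column C (hypothetical columns):

=COUNTIF(B:B,"blog") counts every keyword mapped to a blog page
=COUNTIF(C:C,">10") counts keywords with a CPC above $10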

=SUMIF([range to search],"[condition to match]",[range to return])

SUMIF (documentation)

SUMIF is similar to COUNTIF, but is helpful if you’re trying to add up an additional metric associated with the group of interest, like summing up total keyword volume opportunities by themes or search volume by page type. 
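A hypothetical example, again with page types in column B and monthly search volume in column C:

=SUMIF(B:B,"blog",C:C)

This adds up the search volume for every keyword mapped to a blog page.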

Google Sheets formulas for internationalization

=GOOGLETRANSLATE(text, ["source_language" or "auto"], ["target_language"])

GOOGLETRANSLATE (documentation)

- source_language = the two-letter language code of the source language (or "auto" for Google to guess)
- target_language = the two-letter language code for your target language, like ES for Spanish

Ahh, one of my favorite and most loved Google Sheets hacks.

Rather than go back and forth to the Google Translate UI and risk carpal tunnel, you can bulk translate lists of keywords in seconds into one, or even multiple languages.

You even have the option to auto-select the origin language by changing source_language to “auto” to let G sheets choose for you (which usually works, usually).

Google doesn’t support translating into all “flavors” of languages (e.g., Canadian French), but supports languages like pt-pt and pt-br, as well as Chinese languages like zh-tw and zh-cn.
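For example, a quick bulk translation into Spanish, letting Sheets guess the source language:

=GOOGLETRANSLATE(A2,"auto","es")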

Google Sheets formulas for content/URL management

=SPLIT(text, [delimiter wrapped in ""])

SPLIT  (documentation)

Many times when you’re doing an analysis you might be working with data that is not in the required format you need.

There might be extraneous information that is separated (delimited) by things like commas (addresses), phone numbers (parenthesis and hyphens) and more.

While there’s a “split text to columns” function in the toolbar under “Data,” the SPLIT command lets you split text delimited by a specific character, word or space into individual columns directly in the sheet, so you can quickly trim and tidy your keyword list.
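For example, to break the URL in A2 into one cell per segment, splitting on every slash:

=SPLIT(A2,"/")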

=LEN(text) 

LEN  (documentation)

LEN is a simple Google Sheets formula for SEO that counts the characters in a cell or string.

It can be most helpful when guiding people (both SEOs and non-SEOs) who are writing their own metadata, to stay within a “safe” enough character count so that it will hopefully not get truncated simply due to length.
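A hypothetical title-length check that flags anything over 60 characters:

=IF(LEN(A2)>60,"too long","ok")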

=REGEXREPLACE(text, "regular_expression", "replacement")

REGEXREPLACE  (documentation)

Regexes are a powerful data mining tool when working on large websites.

If you’ve never even heard of regexes, you’ve probably not yet been challenged with an enterprise-level site.

I find myself using REGEXREPLACE most often when I’m cleaning up or trimming URLs in a sheet, where it can be helpful when I only need a path name minus domain or to manage redirects.
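For example, to strip the protocol and domain from the URL in A2 and keep just the path:

=REGEXREPLACE(A2,"^https?://[^/]+","")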

Google Sheets formula for dashboards

=SPARKLINE(B3:G3)

SPARKLINE  (documentation)

=SPARKLINE(B3:G3,{"charttype","line"; "color","indigo"; "linewidth",2}) – this version of the sparkline is drawn in indigo with a slightly heavier line weight

While BI tools like Tableau and Looker offer additional customizations, Google Sheets can be a cheap way to build simple dashboards.

The command SPARKLINE is capable of leveraging data to create simple visualizations in a Google Sheet.

A good amount of SEO and web data looks great on a time series, and Google Sheets can make it easy.

This is most helpful when you have data that is being actively updated inside of Google Sheets and need to skim 10+ trends quickly in one sheet.

A popular use case is to monitor trends like growth across several countries, campaigns or cities.

=SPARKLINE(B3:G3,{"charttype","line"; "color","[color you want]"; "linewidth",2})

Time series/line charts

Time series is probably the most helpful for visualizing changes to traffic patterns over time and is suitable for monitoring most traffic trends and north star goals.

You can also remove the "linewidth" and "color" options for a quick and easy graph, but for time series I find I always need the line to be a little bolder, and a contrasting color helps draw attention to the graph.

Column charts and bar charts
Sparkline even supports column and bar charts! Just change the chart type to column (shown below) or bar.
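For example:

=SPARKLINE(B3:G3,{"charttype","column"})

renders the same range as a mini column chart.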

In more advanced use cases, most of the formulas above can be manipulated to have enhanced outputs, like automated conditional formatting or fun Unicode emoticon responses instead of nulls.

No matter how advanced you make them, using these formulas inside of Google Sheets is a great and cheap way to do basic SEO tidying work and keyword research.

The post 11 Google Sheets formulas SEOs should know appeared first on Search Engine Land.

Google documents how to inject canonical tags using JavaScript

Posted on Jun 30, 2022 in SEO Articles


Google has updated its JavaScript SEO help document to add technical details on how to inject canonical link tags using JavaScript. Google added a new section titled “properly inject rel=”canonical” link tag.”

What is new. Here is the new section, in which Google recommends against implementing your canonical tags with JavaScript – but if you must, it explains the proper way to do so. Google wrote:

While we don’t recommend using JavaScript for this, it is possible to inject a rel=”canonical” link tag with JavaScript. Google Search will pick up the injected canonical URL when rendering the page. Here is an example to inject a rel=”canonical” link tag with JavaScript:
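(Google’s actual snippet isn’t reproduced in this post; a minimal sketch of the pattern, assuming a hypothetical CMS endpoint that returns the page’s canonical URL, looks like this:)

fetch('/api/cms/articles/article-id-123')  // hypothetical CMS endpoint
  .then((response) => response.json())
  .then((article) => {
    const linkTag = document.createElement('link');
    linkTag.setAttribute('rel', 'canonical');
    linkTag.href = article.canonical_url;  // assumes the API returns this field
    document.head.appendChild(linkTag);
  });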

Google added this warning: “When using JavaScript to inject the rel=”canonical” link tag, make sure that this is the only rel=”canonical” link tag on the page. Incorrect implementations might create multiple rel=”canonical” link tags or change an existing rel=”canonical” link tag. Conflicting or multiple rel=”canonical” link tags may lead to unexpected results.”

Hit or miss. We have seen cases where Google can pick up these canonical tags or other embedded elements, even structured data, using JavaScript. But it can be hit or miss, so it is recommended that if you are going to use JavaScript specifically to inject your canonical tags, follow these directions precisely.

Why we care. Again, if you are injecting canonical tags using JavaScript, Google has finally officially documented the proper way to implement it. So check the documentation over here and make sure your implementation follows Google’s recommendations.

The post Google documents how to inject canonical tags using JavaScript appeared first on Search Engine Land.

Forecasting web traffic using Google Analytics and Facebook Prophet

Posted on Jun 30, 2022 in SEO Articles


Ready to learn a quick-and-easy way to get traffic predictions for any amount of time in the future?

Seriously.

This article will show you how you can:

- Predict traffic changes, and maybe even let your boss know when periods of stagnation or negative growth are to be expected.
- Know what to expect during times of increased or decreased traffic, so you can tell whether your declines are in line with predictions, or whether something might be going wrong and traffic is declining more than it should.
- Include a graph with an update to your boss or client of what’s coming in advance, so they know you aren’t just making excuses after the fact.

Want to skip the info and just click a few buttons? 

While we’ll be going through running the code to forecast your web traffic and what each of the sections does, you can skip this and jump right to the Colab here if you aren’t interested in knowing what’s going on and how to make adjustments.

For those who want to run the code locally and be able to edit the hyperparameters (a fancy name for some of the variables that do important things and generally have one value for a complete run of a model) let’s go!

Important note before you begin: The further ahead you ask it to predict, the wider the gap between the low and high estimates gets as the model becomes “less sure of itself.”

How to forecast your Google Analytics traffic

We’ll be using two systems to accomplish our goal:

- UA Query Explorer: In this example, we’re going to use Universal Analytics for our forecasting. I will adjust the code in the Colab to GA4 in about a year, but because the model needs a year or more of data to really do the job, using UA makes the most sense for now – few people have GA4 data going back more than a year. Query Explorer is a tool that will quickly and easily generate the API URL that will pull our analytics for us.
- Facebook Prophet: Prophet is a forecasting model built and open-sourced by Facebook. It includes a lot of great built-in features, such as the ability to import holidays. It’s what’ll turn our analytics data into a forecast.

For those who wish to run locally, you can obviously do so, and the code provided will get the job done.

So, let’s dive in and get you predicting your future traffic!

1. Connect your instance

What this means is you’re “turning on” Google Colab so you can run scripts from it.

2. Import the needed libraries

The next thing we need to do is to import the libraries we need to make all this work. 

They are:

- pandas – a Python library for data manipulation (to help us work with time-series data structures).
- numpy – needed to work with arrays (like our data and sessions arrays).
- matplotlib – we’ll be using this to create some visualizations.
- json – used to work with JSON data.
- requests – used to make HTTP requests (like pulling analytics data).
- fbprophet – used for time series forecasting.
- pystan – used to update probabilities, like the probability of the traffic being X on a date in the future.
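A minimal version of that cell might look like this (a sketch, assuming the fbprophet-era package names the article describes):

import json

import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import requests
from fbprophet import Prophet  # newer releases ship the package as "prophet"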

To run it, all you need to do is click the play button.

You’ll see a bunch of downloads start and the play button turn into a spinning icon to indicate it’s working; when everything has downloaded and installed, the play button will re-appear.

3. Sign up for Google Analytics demos & tools

You need to log in using the Google account tied to the analytics you want to access.

4. Configure the analytics you’re pulling

Next you need to select the account, property and view you want to pull your traffic data from.

Where it notes to pick a metric, you can pick from many of your traffic metrics depending on what you want to know. Examples might be:

- Sessions (the one I use most)
- Visitors
- Unique visitors
- Pageviews

Additionally, when you click the “segments” field a list of all the segments for the property (including custom segments) will display so you can select what traffic you want to look at.

After you’ve run the query, just copy the API request URL.

5. Import analytics into the Colab

Click the play button in the next cell.

You will be asked to enter the API query you just copied.

Paste it in and hit “Enter.”

You should be presented with a graph of the traffic over the date range you selected.
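For those running locally, the gist of that cell is something like this (a sketch; it assumes you ticked Query Explorer’s option to include your access token in the query URI, and that the query returns date and sessions columns):

api_url = input('Paste the API request URL: ')
payload = requests.get(api_url).json()

df = pd.DataFrame(payload['rows'], columns=['date', 'sessions'])
df['date'] = pd.to_datetime(df['date'], format='%Y%m%d')
df['sessions'] = df['sessions'].astype(int)
df.plot(x='date', y='sessions', figsize=(12, 4))
plt.show()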

6. Formatting

The next cell just changes the column headings to what Facebook Prophet expects.
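Prophet expects a “ds” (datestamp) column and a “y” (value) column, so with the columns from the sketch above, the cell boils down to one line:

df = df.rename(columns={'date': 'ds', 'sessions': 'y'})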

7. (Optional) Save

This step is completely unnecessary if you don’t plan on referencing back to the traffic numbers or forecasted numbers. I personally find it handy, but some won’t. 

The first thing you’ll save is simply the traffic numbers (the same as you could export).

I promise it gets more interesting.

8. Adding holidays

The next step is to add holidays and to determine how seasonality is considered. There are some options and ways you can tweak things, or you can run it as is.

The decisions you need to make are:

- What years do you want to pull the holidays for?
- What country do you want to pull the holidays for?

Additionally, you’ll notice the line:

m = Prophet(interval_width=0.95, yearly_seasonality=True, weekly_seasonality=True, daily_seasonality=False, seasonality_mode="additive", changepoint_range=0.85)

You can change any of the parameters to suit your needs, though these settings should work decently in most scenarios:

- interval_width: This is how uncertain we’re willing to let the model be. Set to 0.95, it means that when training, 95% of all points must fit within the model. Set it too low, and it follows general trends but isn’t overly accurate. Set it too high, and it chases too many outliers and becomes inaccurate in that direction.
- yearly_seasonality: Monitors and responds to yearly trends.
- weekly_seasonality: Monitors and responds to weekly trends.
- daily_seasonality: Monitors and responds to daily trends.
- seasonality_mode: Set to either “additive” or “multiplicative”. Additive (the default) results in the magnitude of change being constant. You’d use this in most cases to deal with things like holiday traffic spikes, where the percentage increase vs. pre-Black Friday is more-or-less steady. Multiplicative is used in scenarios where there are growing surges – for example, a growing town that sees an additional increase each year. Not only is there growth, but that growth gets larger with each interval.
- changepoint_range: Change points are points where the traffic changes significantly. By default, Prophet only places change points in the first 80% of the time series; setting changepoint_range to 0.85 extends that to the first 85%.

This is a tip-of-the-iceberg scenario. There are other parameters you can review and apply as you feel so inspired. Details on them are available here.

I’ve set things here to what seems to work well for me in most (but not all cases). 

Yearly and monthly seasonality impact most businesses. Daily, not so much.
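Put together, the model cell amounts to something like this (a sketch – the country code is an assumption, so swap in your own):

m = Prophet(interval_width=0.95, yearly_seasonality=True, weekly_seasonality=True,
            daily_seasonality=False, seasonality_mode="additive", changepoint_range=0.85)
m.add_country_holidays(country_name='US')  # pulls that country's holidays into the model
m.fit(df)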

9. Crunch the numbers

Thankfully you don’t have to do it. 

Simply click the run button.

And you’ll soon see:

Not all the rows or columns are showing. If they were, what you’d see is:

- The highest number the model predicts as likely (yhat_upper).
- The lowest (yhat_lower).
- The predicted value (yhat).

Importantly, the code sets “periods=90” – that is the number of days I’m going to get predictions for (see the sketch below).

I’ve found 90 works decently. After that, the range gets pretty large between high and low but can be interesting to look at.
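The cell itself is the standard Prophet pattern (a sketch):

future = m.make_future_dataframe(periods=90)  # number of days ahead to predict
forecast = m.predict(future)
print(forecast[['ds', 'yhat', 'yhat_lower', 'yhat_upper']].tail())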

10. (Optional) Save predictions

This is an optional step for those who would like to save their predicted values, or use them to test against different parameter values (those discussed in step eight above).

Once run, you’ll just click the link it outputs.

Each time you run it, your numbers and results will be stored and can be easily accessed at a future time to compare with different runs.

It will also give you the numbers to reference if you’re ever asked for a predicted value for a specific day.

11. The magic

Hit the run button and you get what you’ve likely come here to get.
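Under the hood, that cell is essentially (a sketch):

fig = m.plot(forecast)  # history plus the forecast and its uncertainty band
components = m.plot_components(forecast)  # trend, weekly and yearly seasonality
plt.show()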

Optional

I’ve added an extra Insights section. It simply displays the impact of some of the areas we’ve been discussing above.

You can see in the top chart where the different change points are. Further down, you get insights into how the different seasonal trends are impacting the predictions, etc.

Closing

I’ve always looked for ways to predict in advance what’s coming my way.

It’s always better to show your boss or client that a slowdown is expected a week before it happens rather than try to explain it after the fact. 

Additionally, this insight can also help you plan your strategy.

Your work may be different at your peak traffic points than it is when you’re in a lull. You can look back over your analytics trends month-by-month, year-by-year and try to piece it together – or just let machines do what machines do best.

Just a reminder, if you got to the bottom and wanted to get to the Colab to run this yourself, you’ll find it here.

The post Forecasting web traffic using Google Analytics and Facebook Prophet appeared first on Search Engine Land.

3 ways to dominate with Google Auction Insights and search intelligence

Posted on Jun 30, 2022 in SEO Articles


While marketers have overcome many challenges in recent years, sadly, the second half of 2022 is poised to be very different from the first. Unprecedented macroeconomic factors such as high inflation, food and energy costs and the war in Ukraine are impacting our business.

Hiring is starting to slow down, and the pressure is on. There is a good chance you’ll be asked to do more with less, as budgets may be reprioritized and cut in certain areas.

On the flip side, Google Search ad spend continues to rise. It’s the channel that is always on, the channel with the highest conversion rate and the channel that won’t go away.

This part of the marketing mix is reliable and constant, but are the campaigns delivering success? Are they contributing to sales? Contributing to leads?

Auction Insights is a powerful tool we’ve all come to use for understanding campaign performance against competitors. Search intelligence adds another layer of granularity to ensure you’re one step ahead of your competition.  

Join Ashley Fletcher, VP of Marketing at Adthena, in his informative SMX Advanced session to explore three easy search intelligence tactics that will help you dominate your competitive landscape. He also shares use cases from L’Oreal and Avanti West Coast trains.

After this session, you’ll be able to save time with competitive monitoring, track performance over time and see your competitor’s spend and ad copy. The presentation will help you use data to make better ad campaign decisions and dig into search intelligence to understand why certain ads are successful to ultimately dominate the competition.

The post 3 ways to dominate with Google Auction Insights and search intelligence appeared first on Search Engine Land.

Bot traffic: What it is and why you should care about it

Posted on Jun 30, 2022 in SEO Articles


Bots have become an integral part of the digital space of today. They help us order groceries, play music on our Slack channel, and pay our colleagues back for the delicious smoothies they bought us. Bots also populate the internet to carry out the functions they’re designed for. But what does this all mean for website owners? And perhaps more importantly, what does this mean for the environment? Read on to find out what you need to know about bot traffic and why you should care about it!

What is bot traffic?

To begin, a bot is a software application created to perform automated tasks over the internet. Bots can imitate or replace the behavior of a real user. They’re very good at executing repetitive and mundane tasks. They’re also swift and efficient, which makes them a perfect choice if you need to do something on an enormous scale.

Bot traffic refers to non-human traffic to a website or app. If you own a website, you’ve likely been visited by a bot. Bot traffic accounts for more than 40% of the total internet traffic in 2022. We’ve seen this number rising in recent years, and we will continue to see this trend in the foreseeable future.

Bot traffic sometimes gets a bad name, and in many cases bots are indeed bad. But there are good and legitimate bots too – it depends on the purpose of those bots. Some bots are essential for operating digital services like search engines or personal assistants. Some bots want to brute-force their way into your website and steal sensitive information. So which bot activities are ‘good’ and which are ‘bad?’ Let’s go a bit deeper into these two kinds of bots.

The ‘good’ bots

The ‘good’ bots carry out specific functions that do not cause harm to your website or server. They announce themselves and let you know what they do on your website.

The most popular bot of this type is probably search engine crawlers. Without crawlers visiting your website to discover content, search engines would have no way to serve you information when you search for something. When we talk about ‘good’ bot traffic, we’re talking about these bots. It’s perfectly normal for a site to have a small percentage of traffic coming from ‘good’ bots. Other than search engine crawlers, some other good internet bots include:

- SEO crawlers: If you’re in the SEO space, you’ve probably used tools like Semrush or Ahrefs to do keyword research or gain insight into competitors. For those tools to serve you information, they also need to send out bots to crawl the web and gather data.
- Commercial bots: Commercial companies send these bots to crawl the web to gather information. For instance, research companies use them to monitor news on the market; ad networks need them to monitor and optimize display ads; ‘coupon’ websites gather discount codes and sales programs to serve users on their websites.
- Site-monitoring bots: They help you monitor your website uptime and other website metrics. They periodically check and report data such as your server status and uptime duration so you can take action when something’s wrong with your site.
- Feed/aggregator bots: They collect and combine newsworthy content to deliver to your website visitors or email subscribers.

The ‘bad’ bots

The ‘bad’ bots are created with malicious intentions in mind. You are probably familiar with spam bots that flood your website with nonsense comments, irrelevant backlinks, and atrocious advertisements. You’ve probably also heard of bots that take people’s spots in online raffles or buy out the good seats at concerts.

Because of these malicious bots, bot traffic gets a bad name. Unfortunately, a significant amount of bot traffic comes from such ‘bad’ bots. It is estimated that bad bot traffic will account for 27.7% of internet traffic in 2022. Here are some of the bots that you don’t want on your site:

- Email scrapers: They harvest email addresses and send malicious emails to those contacts.
- Comment spam bots: They spam your website with comments and links that redirect people to a malicious website. In many cases, they spam your website to advertise or to try to get backlinks to their sites.
- Scraper bots: These bots come to your website and download everything they can find – your text, images, HTML files and even videos. Bot operators will then re-use your content without permission.
- Bots for credential stuffing or brute force attacks: These bots will try to gain access to your website to steal sensitive information. They do that by trying to log in like a real user.
- Botnets (zombie computers): These are networks of infected devices used to perform DDoS (distributed denial-of-service) attacks. During a DDoS attack, the attacker uses such a network of devices to flood a website with bot traffic, overwhelming the web server with requests and resulting in a slow or unusable website.
- Inventory and ticket bots: They go to websites to buy up tickets for entertainment events or to bulk purchase newly released products. Brokers use them to resell the tickets or products at a higher price to make a profit.

Why you should care about bot traffic

Now that you’ve got some knowledge about bot traffic, let’s talk about why you should care about it.

For your website security and performance

We’ve discussed several types of bad bots and their functions. You do not want malicious bots lurking around your website. They will undoubtedly wreak havoc on your website performance and security.

Malicious bots disguise themselves as regular human traffic, so they might not be visible when you check your website traffic statistics. That can hurt your business decisions because you don’t have the correct data. You might see random spikes in traffic but don’t understand why. Or you might be confused as to why you receive traffic but no conversion.

Next to this, malicious bot traffic strains your web server and might sometimes overload it. These bots take up your server bandwidth with their requests, making your website slow or utterly inaccessible in case of a DDoS attack. In the meantime, you might have lost traffic and sales to other competitors.

And malicious bots are bad for your site’s security. They will try to brute force their way into your website using various username/password combinations or seek out weak entry points and report to their operators. If you have security vulnerabilities, these malicious players might even attempt to install viruses on your website and spread those to your users. And if you own an online store, you will have to manage sensitive information like credit card details that hackers would love to steal.

For the environment

Let’s come back to the question at the beginning of the post. You need to care about bot traffic because it affects the environment more than you might think.

When a bot visits your site, it makes an HTTP request to your server asking for information. Your server needs to respond to this request and returns the necessary information. Whenever this happens, your server must spend a small amount of energy to complete the request. But if you consider all the bots on the internet, then the amount of energy spent on bot traffic is enormous.

In this sense, it doesn’t matter if a good or bad bot visits your site because the process is still the same. They both use energy to perform their tasks, and they both have consequences on the environment. Even though search engines are an essential part of the internet, they are guilty of being wasteful too.

You know the basics by now: search engines send crawlers to your site to discover new content and refresh old content. But they can visit your site too many times and not even pick up the right changes. We recommend checking your server log to see how many times crawlers and bots visit your site. The Crawl Stats report in Google Search Console also tells you how many times Google crawls your site. You might be surprised by some numbers there.

A small case study from Yoast

Let’s take Yoast, for instance. On a given day, Google crawlers can visit our website 10,000 times. It might seem reasonable for them to visit us a lot, but they only crawl 4,500 unique URLs. That means energy was used on crawling the same URLs over and over. Even though we regularly publish and update our website content, we probably don’t need all those crawls. And these crawls aren’t just for pages; crawlers also go through our images, CSS, JavaScript, etc.

But that’s not all. Google bots are not the only ones visiting us. There are bots from other search engines, digital services, and even bad bots. Such unnecessary bot traffic strains our website server and wastes energy that could otherwise be used for other valuable activities.

Statistic on the crawl behaviors of Google crawlers on Yoast.com in a day

What to do against ‘bad’ bots

You can try to detect bad bots and block them from entering your site. That will save you a lot of bandwidth and reduce strain on your server, which in turn helps save energy.

The most basic way to do this is to block an individual IP address or an entire range of IP addresses. If you identify irregular traffic from a source, block that IP address. This approach works, but it’s labor-intensive and time-consuming. Alternatively, you can use a bot management solution from providers like Cloudflare. These companies have an extensive database of good and bad bots, and they use AI and machine learning to detect malicious bots and block them before they can cause harm to your site.

You should install a security plugin if you’re running a WordPress website. Some of the more popular security plugins (like Sucuri Security or Wordfence) are maintained by companies that employ security researchers who monitor and patch issues. Some security plugins automatically block specific ‘bad’ bots for you. Others let you see where unusual traffic comes from and decide how to deal with that traffic.

What about the ‘good’ bots

As we mentioned earlier, the ‘good’ bots are good because they are essential and transparent in what they do. But they can consume a lot of energy while performing their tasks, which impacts the environment. And these good bots might not even be helpful to you: even though what they do can be considered ‘good,’ they might still bring disadvantages to your website and, ultimately, to the environment. So what can you do about the good bots?

1. Block them if they are not useful

You need to think and decide whether or not you want these ‘good’ bots to crawl your site. Does their crawling benefit you? And, more importantly, does that benefit outweigh the cost to your servers, their servers, and the environment?

Let’s take search engine bots, for instance. You know that Google is not the only search engine out there. It’s most likely that crawlers from other search engines have visited you. Let’s say you check your server log and see that a search engine has crawled your site 500 times today, but it only brings you ten visitors. If that’s the case, would it be useful to let bots from that search engine crawl your site? Or should you block them because you don’t get much value from this search engine?

2. Limit the bot’s crawl rate

If a crawler supports the crawl-delay directive in robots.txt, you should try to limit its crawl rate so it doesn’t come back every 20 seconds and crawl the same links over and over. This is very useful for medium to large websites that crawlers visit often, but small websites benefit from crawl delays too. Most likely, you don’t update your website content 100 times on a given day, even on a larger site. And if you have copyright bots visiting your site to check for copyright infringement, do they need to come every few hours?

You could play with the crawl rate and monitor its effect on your website, and you can assign a specific crawl delay to crawlers from different sources. Start with a slight delay and increase the number once you’re sure it doesn’t have negative consequences. Unfortunately, Google doesn’t support crawl-delay, so you don’t need to set this for Google’s bots.
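For crawlers that do honor it, the directive is a robots.txt entry like this (Bingbot is just an example here; the delay is in seconds):

User-agent: bingbot
Crawl-delay: 10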

3. Help them crawl more efficiently

You can decide which parts of your site you don’t want bots to crawl and block their access via robots.txt. This not only saves energy but also helps to optimize your crawl budget.

There are a lot of places on your website where crawlers have no business coming. That can be your internal search results, for instance. Nobody wants to see those on public search engines. Or, if you have a staging website, you probably don’t want people to find it.

Next, you can help bots crawl your site better by removing unnecessary links that your CMS and plugins automatically create. For instance, WordPress automatically creates an RSS feed for your website comments. Of course, this RSS feed has a link. But hardly anybody looks at it anyway, especially if you don’t have a lot of comments. Hence, the existence of this RSS feed might not bring you any value. It just creates another link for crawlers to crawl repeatedly, wasting energy in the process.
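As an illustration, a robots.txt that keeps crawlers out of the spots mentioned above might look like this (the paths are hypothetical – match them to your own site):

User-agent: *
Disallow: /?s=
Disallow: /staging/
Disallow: /comments/feed/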

Optimize your website crawl with Yoast SEO

We’ve recently launched a feature in Yoast SEO Premium that lets you optimize your website to make it easier for crawlers to crawl your site. Within the crawl settings in Yoast SEO Premium, you’ll find many toggles that let you turn off various things WordPress automatically adds to your site that most sites won’t miss.

At the moment, there are 20 toggles available in the crawl setting. We’ve added a lot more options since the feature was first released in Yoast SEO Premium 18.6. It’s good to know this is currently in beta. We will be working hard to improve this feature and add more settings to help you optimize your site’s crawlability. Check out this page to learn more about our crawl feature!

The post Bot traffic: What it is and why you should care about it appeared first on Yoast.

Google Maps adds new store location feature, Locator Plus, Reserve with Google integration, new analytics and more

Posted on Jun 29, 2022 in SEO Articles


The Google Maps Platform has added a bunch of new features that let businesses integrate more deeply with its APIs and streamline their Google Maps implementations. These include the ability to embed Reserve with Google on your site, a new embed-a-store-location tool, a Locator Plus feature, store locator analytics and more.

Reserve With Google Embed Feature

We have seen the Reserve with Google feature in Google Search and Google Maps for a while now. Now, Google is allowing businesses to use Reserve with Google on their own sites or portals.

Google added new APIs to allow you to embed the Reserve with Google on your site and your own maps. Google said this allows for “end-to-end appointment booking flow, and connects users to a variety of services.” It allows your customers who are using your store locator an option for bookings, right within the locator, which creates an easier booking process.


Store Locator Embed Feature

Google also has rolled out a way to manage and publish your store locator using the Google Maps platform. Google said this will allow you to “quickly update and roll out your store locator” on your site. In the Locator Plus solution, you can capture the location of every single store you want to show users – all within one map, Google explained. This can now be done with a simple copy and paste of some embed code and very little API development coding.

Google Locator Plus

Similar to the store locator feature, the new Locator Plus feature allows businesses to easily import business details from their Google Business Profile. The details you already have in your Google Business Profile – including hours, contact information, photos, service options and more – will be reflected in the store locator on your website.

Google Store Locator Analytics

With all these announcements, Google also is rolling out a new Google store locator analytics dashboard. This analytics dashboard should help you better understand the impact of your implementation and generate insights from your data. It shows you how well your site visitors are engaging with your store locator, Google said.

The dashboard helps you measure your performance week over week, including number of views, number of interactions with Google Search and Google Place Details, and overall engagement rate. The dashboard uses anonymized data to provide important benchmarks on how a developer’s implementation compares against other developers using the same solution.

Why we care. All these tools can be useful for large and small businesses to manage their local presence not just on Google but also on their own site. Plus, searchers are accustomed to Google and may find these embed features familiar and be more likely to use them.

At the same time, this puts more and more of your data in Google, making you more reliant on Google for the management and hosting of these features and data. So keep that in mind before implementing them on your site.

The post Google Maps adds new store location feature, Locator Plus, Reserve with Google integration, new analytics and more appeared first on Search Engine Land.

Webinar: Work smarter, not harder, to give customers what they want

Posted on Jun 29, 2022 in SEO Articles


Personalizing your marketing campaigns for one customer is easy, but how about one hundred or thousands of customers across multiple marketing channels?

Work smarter, not harder, by using artificial intelligence (AI) as part of your martech stack and giving your customers the unique experiences they crave.

Register today for “Use Data to Create Next-Level Customer Experiences at Scale,” presented by MoEngage.

The post Webinar: Work smarter, not harder, to give customers what they want appeared first on Search Engine Land.

Warby Parker dodges 1-800 Contacts lawsuit over search results, website

Posted on Jun 29, 2022 in SEO Articles

Online retailer Warby Parker was sued by 1-800 Contacts over its use of the latter’s branded keywords to redirect searchers to the Warby Parker online store. A Manhattan federal judge dismissed the case, saying that Warby Parker’s ads are unlikely to confuse potential customers.

The decision. Judge Kevin Castel argued against 1-800 Contacts, saying that customers are unlikely to think they’re buying from 1-800 Contacts when they click on a Warby Parker ad. Castel also said the companies’ trademarks were too dissimilar to confuse contact-lens buyers, who are likely to pay close attention to what they are purchasing, and noted that Warby Parker’s name is clearly displayed in the search results and on its website.

Castel added that prospective customers will take the time to figure out that the search results link to Warby Parker’s website, and will therefore discern that they are buying contacts from Warby Parker.

1-800 Contacts response. A spokesperson for 1-800 Contacts said after the ruling that the decision by the judge was “inconsistent with several well-established legal principles,” and that the company is “evaluating appropriate next steps, including whether to appeal.”

Sounds familiar. Earlier this year we reported on an attempt by Edible Arrangements to sue Google over theft, conversion, and racketeering. Edible Arrangements lost that lawsuit, but this was not the first case courts heard over trademark and copy issues. 

In my own Google search, I was unable to mimic the results that this suit was based on and didn’t find any Warby Parker ads initially. You can read the article and ruling from Reuters here.

Why we care. The dismissal of this case against Warby Parker should concern advertisers who are competing for branded keywords. If you’re facing a similar issue, you can visit the Google trademark help document, but that can be a painstakingly long process and a temporary band-aid for a much bigger issue. Conversely, if you’re using another brand’s keywords in your ad strategy, be careful, as you could find yourself in hot water.

The post Warby Parker dodges 1-800 Contacts lawsuit over search results, website appeared first on Search Engine Land.

FCC wants Apple and Google to remove TikTok

Posted on Jun 29, 2022 in SEO Articles

US Federal Communications Commissioner Brendan Carr is requesting that Apple and Google ban TikTok from their app stores. The request is due to the social media app’s “pattern of surreptitious data practices.”

TikTok is not just another video app.
That’s the sheep’s clothing.

It harvests swaths of sensitive data that new reports show are being accessed in Beijing.

I’ve called on @Apple & @Google to remove TikTok from their app stores for its pattern of surreptitious data practices. pic.twitter.com/Le01fBpNjn

— Brendan Carr (@BrendanCarrFCC) June 28, 2022

The request comes after BuzzFeed News reported that US user data had been accessed from China. TikTok had been adamant for years that US users’ data was kept in the US. But according to leaked audio, TikTok employees have been able to access nonpublic data about TikTok users.

“As you know TikTok is an app that is available to millions of Americans through your app stores, and it collects vast troves of sensitive data about those US users. TikTok is owned by Beijing-based ByteDance — an organization that is beholden to the Communist Party of China and required by the Chinese law to comply with PRC’s surveillance demands,” Carr said in a letter addressed to Sundar Pichai and Tim Cook.

“It is clear that TikTok poses an unacceptable national security risk due to its extensive data harvesting being combined with Beijing’s apparently unchecked access to that sensitive data.”

TikTok’s response. After the revelations, TikTok was quick to respond that it was moving all US users’ data to Oracle servers. The company added that it expects to delete all US user data from its own data centers.

This sounds familiar. In 2020, then-President Donald Trump signed an executive order banning TikTok from the US, citing a 2019 national security investigation. TikTok responded that the EO was issued without due process. Last year, President Biden revoked the ban and replaced the order with requirements that the Commerce Department review apps that may pose national security risks.

TikTok, Apple, and Google have not yet responded to the request. 

Why we care. Apparently, only the US can access US user data. In all seriousness, the calls for a ban seem to be hypocritical at best. While we understand the need to limit national security threats from foreign adversaries, where do we draw the line? Will all foreign apps be required to keep their US data in the country? 

The post FCC wants Apple and Google to remove TikTok appeared first on Search Engine Land.