SEO Articles

How to use the readability analysis

Everybody knows the colored bullets in Yoast SEO. Two parts of the plugin use this traffic light system: the content analysis and the readability analysis. The former checks whether your post is SEO-proof, while the latter checks if it is readable for a general audience. Of course, these two are interconnected, as readable content is incredibly important if you want your site to do well in the search results. Here, I’ll show you how to use the readability analysis.

What does the readability analysis in Yoast SEO do?

The readability analysis uses an algorithm to determine how readable your post is. We’ve carefully crafted this algorithm to make it as accurate as possible without being too strict. It features several checks that’ll give you advice when you write your post. In other words, by following the advice, you can make your text easier to read and understand.

It has been said that Yoast SEO suggests dumbing down your writing. Of course, that’s not the case. We merely want to help people write easy-to-understand content. I always come back to this quote by content designer Sarah Richards about making your content as readable for humans as possible:

“You’re not dumbing down, you’re opening up.”

By simplifying content, you’re automatically growing your audience, as more people grasp the message of your content. Also, you’re not writing your content just for people anymore. Virtual assistants like Alexa and Siri have to be able to work with it as well. And even Google increasingly uses well-written pieces of content for rich results like featured snippets.

That being said, while the advice in the readability section is not the be-all and end-all, it does give you important clues about the perceived difficulty of your text. It is crucial to write with readability in mind, as we think readability ranks!

Current readability checks

At the moment, Yoast SEO uses the following checks:

  • Transition words: Do you use transition words like ‘most importantly’, ‘because’, ‘therefore’, or ‘besides that’ to tie your text together? Using these words improves the flow of your article as they provide hints to the reader about what is coming next.
  • Sentence beginnings: Do any of your consecutive sentences start with the same word? This might feel repetitive to your reader, and that can be annoying. Always keep your sentences varied, so your article is readable and free of obstacles. Unless you want to prove something or use it as a writing style, of course.
  • Flesch reading ease: This world-famous test analyzes texts and grades them on a scale from 0-100; the formula itself is shown right after this list. The lower the score, the more difficult the text is to read. Texts with a very high Flesch reading ease score (close to 100) are very easy to read. They have short sentences and no words of more than two syllables. Usually, a reading ease score of 60-70 is considered acceptable/normal for web copy.
  • Paragraph length: Some people tend to use extremely long paragraphs. Doing so makes your text look daunting as it becomes just one big blob of text. Break it up, use shorter paragraphs and don’t forget to give your core sentences some thought.
  • Subheading distribution: Similarly to long paragraphs, texts without subheadings are difficult to scan, which makes them rather daunting. So, we check if you use enough subheadings to guide your readers through the text and help them find what they’re looking for.
  • Sentence length: Sentence length is one of the core aspects that can make a text hard to read. If most of your sentences are too long – over 20 words – people lose track of your point. Readers often have to jump back a few words to find out what you mean. This is very tiring and inefficient. Try to keep the number of words in a sentence in check. Shorten your sentences. Aim for easy understanding, not a complex literary masterpiece.
  • Passive voice: Using a lot of passive voice in your text makes it appear distant, and your message will be less clear. Your sentences become wordy and difficult because the sentence structure is harder to understand. Whenever you use the passive voice, always consider whether a better, active alternative is available.
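
For reference, this is the standard Flesch reading ease formula mentioned above (Flesch’s original formulation; Yoast’s per-language implementation may differ in its details):

    \text{Reading Ease} = 206.835 - 1.015 \cdot \frac{\text{total words}}{\text{total sentences}} - 84.6 \cdot \frac{\text{total syllables}}{\text{total words}}

Long sentences and polysyllabic words push the score down; short sentences built from short words push it up toward 100.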

Supported languages

The readability analysis is available in English and several other languages, including German, French, Spanish, and Russian. Check out the features per language for an overview. We’re continually working on adding new languages.

How to use the readability analysis in Yoast SEO

It’s very easy to use the readability analysis in Yoast SEO to improve your content. Personally, I just start writing the article I want to write. I keep the audience I’m writing for in the back of my mind and try to use the words they would use. Although the readability score is calculated in real time, I don’t look at it during the writing process. Only after (the draft of) my article is finished do I check the readability score and see if I have to fix anything. If I get an orange or red bullet, I can click on the eye icon to jump to the spot where improvements can be made. Easy peasy!

Everyone has their own writing and editing process, and my way isn’t necessarily how you should use it. For instance, you might be targeting a Flesch score of 80. If so, you’ll have to find out gradually what works. After using the readability tool for a while, you’ll notice that you automatically get a feel for the level of text you’re aiming for. Practice makes perfect!

The readability checks in Yoast SEO

Should all bullets be green?

This is a question we often get, and no, not every bullet has to be green. What you should aim for, though, is a green, happy bullet overall – the one in the tab that reads “Readability”. Having an orange bullet for one of the checks, like in the screenshot above, is OK. It’s not that your article won’t be able to rank if it doesn’t pass all of the tests. This is merely an indication, not a necessity.

We want everyone to be able to read and understand content, but we also know that there are industries where the language used is totally different from what ordinary people would use. That’s perfectly fine. Find out what works in your case. Need help? Please read our ultimate guide to SEO copywriting.

Try it out!

The readability and content analyses of Yoast SEO help you to write excellent, SEO-proof articles that are easy to grasp for anyone. In doing so, you make sure that every piece of content you write is ready to start ranking in search engines, while staying enjoyable for readers. Don’t have Yoast SEO yet, or want to take advantage of the awesome additional features our Premium plugin offers? What are you waiting for?

Read more: How to use the content & SEO analysis of Yoast SEO »



5 Ways to Use Google Search Console (Like a Pro)

Google Search Console is a fundamental tool in every successful SEO’s toolkit.

The best part?

It’s free!

In this guide, I’ll show you how to use Google Search Console to improve your SEO performance, so you can get more traffic, leads, and customers from organic search.

Let’s jump in.

Before I show some cool tactics, I need to cover the basics.

How to Set Up Google Search Console

1. Go here and enter your email address.

Google Search Console Homepage

2. Click the dropdown on the upper lefthand side.

Step 1

3. Click “Add Property”

Step 2

4. Select the “Domain” option and enter your root domain (example: gotchseo.com). Then click “Continue”.

Step 3

5. Copy the TXT record and sign in to your registrar (where you purchased your domain).

Step 4

6. If you’re using GoDaddy, click on “My Products”, look under the “Domains” section to find your domain, and then click on “DNS”.

Step 5

7. Click on “Add” under “Records”.

Step 6

8. Select “TXT”. Enter “@” under “Host”, enter the TXT record you copied from Google Search Console under “TXT Value”, and click “Save”.

Step 7

9. Go back to Google Search Console and click “Verify”. You may end up seeing the “Ownership verification failed” message like this:

Ownership Verification Failed Google Search Console

10. Google recommends waiting a day and then trying to verify again. In most cases, you’ll see the “Ownership auto verified” message like this:

Ownership Auto Verified Google Search Console
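
If verification keeps failing, you can confirm that the TXT record has actually propagated before retrying. Here’s a minimal sketch using the third-party dnspython library; example.com stands in for your root domain:

    import dns.resolver  # third-party: pip install dnspython

    # List the TXT records for the root domain and look for Google's token.
    for rdata in dns.resolver.resolve("example.com", "TXT"):
        txt = b"".join(rdata.strings).decode()
        if txt.startswith("google-site-verification="):
            print("Verification record found:", txt)

If nothing prints, the record hasn’t propagated yet (or was added under the wrong host), and clicking “Verify” again won’t help.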

The last step is to integrate Google Search Console data with your site’s Google Analytics data.

How to Integrate Google Search Console Data with Google Analytics

1. Go to Google Analytics and click on the target website. Then click on “Acquisition”, “Search Console”, and then “Landing Pages”.

Step 1

You’ll see this screen (click “Set up Search Console data sharing”):

Step 2

If you don’t see your domain on the list, then click on “Add a site to Search Console”.

If you don't see your site

As of right now, this is a glitch.

For some reason, when you add a site on the new Google Search Console, it doesn’t add to the old version. Google Analytics is integrated with the old version, so it’s causing some issues.

To work around it, add the target site to the old version and then go back and refresh the page. It should show up now.

Select it, make sure it matches the “Web Property” at the top, and click save.

Select site

Click “OK” when you see the “Add association” pop up.

Add Association

Go back to Google Analytics and refresh the page. It should now be integrated.

Integrated

Keep in mind that it will take a few days to start showing data inside Google Analytics.

Now that you’re all set up, let’s jump into how to use this amazing free tool.

5 Ways to Use Google Search Console to Increase Your Traffic

  1. Optimize Crawling and Indexing
  2. Identify Low Hanging Fruits
  3. Increase Organic Search CTR
  4. Perform CRO
  5. Track Branded Search Performance

Optimize Crawling and Indexing

The first way to use Google Search Console is to use the URL Inspection tool.

The URL Inspection tool is useful because you can check the indexation and mobile-friendliness of any URL on your website. Copy any URL and enter it into the search bar:

URL inspection tool

You’ll end up on this page and the goal is to have green checkmarks for every option.

URL is on Google

Here’s how it might look if Google hasn’t crawled and indexed a page on your site:

URL is not on Google

What do you do in this scenario?

First, do not “Request Indexing”.

If your page isn’t getting crawled and indexed, there’s a reason (or many reasons). You need to audit your site to identify what’s preventing Google from either crawling or indexing your pages.

Let’s start with crawling because Google can’t index a page unless it can crawl it.

There are a few possible reasons why Google can’t crawl a page:

  • Your robots.txt file is blocking Google’s crawlers (see the sketch after this list).
  • Your page is buried so deep within your site’s architecture that Google’s crawlers can’t reach it (or have given up).
  • Your website’s loading speed is too slow and Google’s crawlers give up.
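
For the robots.txt case, you can test a URL much like Google’s tester does, straight from the Python standard library. A minimal sketch; example.com and the path are placeholders:

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the live robots.txt, then ask whether Googlebot may crawl a URL.
    rp = RobotFileParser("https://example.com/robots.txt")
    rp.read()
    print(rp.can_fetch("Googlebot", "https://example.com/story/"))  # False = blocked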

If Google is crawling your site, but your pages aren’t indexed, then it might be because:

  • You’re using the “noindex” tag
  • Your site architecture is poorly structured
  • Your page is slow
  • Your page is unresponsive
  • Your website rarely publishes new content

And many other reasons outside the scope of this guide. The good news is that you can actually use Google Search Console to find some of these issues.

Let’s move on to the “Index” section. Click on “Coverage” and this section will show you every technical issue that Google has found.

Coverage section

If you’re having indexation issues, then see if you have an obvious error such as “Submitted URL marked ‘noindex’”.

Submitted URL marked noindex

Click through and make sure you actually want these pages to be noindexed.

Affected Pages

Otherwise, remove the noindex tag and Google will crawl and then index it.
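
To double-check that the tag is really gone after you edit the page, a quick script can inspect both places a noindex can live. A rough sketch using the third-party requests library; the URL is a placeholder, and the meta check is deliberately crude (attribute order can vary):

    import re
    import requests  # third-party: pip install requests

    url = "https://example.com/story/"
    resp = requests.get(url, timeout=10)

    # noindex can arrive via an HTTP header or a robots meta tag.
    in_header = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    in_meta = bool(re.search(
        r'<meta[^>]*name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
        resp.text, re.IGNORECASE))
    print(f"header noindex: {in_header}, meta noindex: {in_meta}")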

If you don’t find the suspect URL in this section, go back to the “Coverage” overview section. Then click on “Excluded”. Scroll down and click on “Excluded by ‘noindex’ tag”.

excluded by noindex tag

For example, I want my “Story” page to be indexed in Google, but it’s using the “noindex” tag by accident.

Excluded

If you click on the URL, Google Search Console will give you two options:

  1. Inspect URL
  2. Test Robots.txt Blocking

Two crawl options

Start with “Test Robots.txt Blocking” to see whether your robots.txt is blocking Google’s crawlers (it will take you to the old version of Google Search Console).

robots txt tester

If it passes the test, move onto the “Inspect URL” option. Make sure you have removed the “noindex” tag from the target page and then click “Request Indexing”.

Request Indexing

You should see the “Indexing requested” confirmation popup.

Indexing Requested

Now just wait a few days (maybe even a week) to see if the page is indexed.

The “Coverage” section is robust and there are many technical issues you can tackle. I recommend digging through the “Excluded” section and fixing each issue one-by-one.

Excluded Section

One other issue you’ll want to look for isn’t as obvious. It’s called index bloat.

This is a very common problem when I’m conducting SEO audits. In short, “index bloat” is when you have pages indexed in Google that shouldn’t be.

This can cause crawl issues, duplicate and thin content issues, and it can even dilute your site’s authority. I recommend exporting the URLs from the “Valid” and “Submitted and indexed” section.

Valid

The best way to decide if pages should be indexed is by using a combination of data and manual analysis.

Check out the video below on how to perform a content audit. I use Screaming Frog SEO Spider in the demonstration, but the general thought process and nuance will apply no matter what tool you’re using.

Subscribe on YouTube for more free SEO training videos.

The last thing you need to do in the “Coverage” section is to make sure you’ve submitted a sitemap.

Submit Sitemap
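
Sitemaps can also be submitted programmatically through the Search Console API. A hedged sketch with google-api-python-client; the site and sitemap URLs are placeholders, and `credentials` is assumed to be an authorized google-auth object (the OAuth flow is omitted):

    from googleapiclient.discovery import build

    # `credentials` = an authorized google-auth credentials object (flow omitted).
    service = build("searchconsole", "v1", credentials=credentials)
    service.sitemaps().submit(
        siteUrl="https://example.com/",
        feedpath="https://example.com/sitemap.xml",
    ).execute()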

Now let’s move on to the “Performance” section.

Identify Low Hanging Fruits

Google Search Console’s “Performance” section is where all the magic happens.

Performance

If you’ve had it installed on your site for a while, you have tons of critical data at your fingertips.

I’m not going to bore you and show you how to look at the data.

Instead, I’m going to show you how to leverage this data to get more organic search traffic.

The first method is to identify low hanging fruits.

A “low hanging fruit” is any keyword phrase ranking from positions #11 – #20.

These keyword phrases are only a few tweaks away from landing on the first page. You know this, but being on the second page of Google is almost like being completely invisible.

To find these low hanging fruits, click on “Average position”.

Average position

Then scroll down and click the filter option.

Filter Position

Check “Position”, select “Greater than” from the dropdown, and enter “11”.

Greater than

These keyword phrases are your low hanging fruits. I recommend going after phrases with the highest volume.
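
If you’d rather pull this data programmatically, the same filter is easy to reproduce with the Search Console API. A minimal sketch using google-api-python-client; the site URL and date range are placeholders, and `credentials` is the same authorized object assumed in the sitemap sketch above:

    from googleapiclient.discovery import build

    service = build("searchconsole", "v1", credentials=credentials)
    body = {
        "startDate": "2019-06-01",
        "endDate": "2019-08-31",
        "dimensions": ["query"],
        "rowLimit": 5000,
    }
    response = service.searchanalytics().query(
        siteUrl="https://example.com/", body=body).execute()

    # Low hanging fruits: queries with an average position between 11 and 20,
    # sorted so the highest-volume opportunities come first.
    rows = response.get("rows", [])
    low_hanging = [r for r in rows if 11 <= r["position"] <= 20]
    for r in sorted(low_hanging, key=lambda r: r["impressions"], reverse=True)[:25]:
        print(f'{r["position"]:.1f}  {r["impressions"]:>6}  {r["keys"][0]}')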

Now the question is:

How?

The fastest method is to make sure the phrase is mentioned on the page. If it’s a high volume keyword, then you may need to create another section on the page.

Google is telling you what keywords should be targeted on that page. Take advantage of it!

First, click on the target keyword phrase.

query

Then click on the “Pages” tab. This tab will show you what page on your site is ranking for that keyword.

Pages tab

Second, view the page and search for the keyword phrase. “Backlink builder” isn’t mentioned once in my guide about backlinks.

Search

That means that the first step is to figure out how to integrate that phrase onto the page. I recommend searching the exact phrase in Google to see what the intent is.

In this case, 7 out of the 10 results are tools.

SERP example

That means it might make sense for me to add a section about “Free Backlink Builder Tools”. I could also reframe it to show a list of the “Top Backlink Builders”.

The key takeaway is to model the search intent for the keyword. In some cases, you can add the keyword variation a few times in the copy (read this guide about on-page SEO).

Once you’ve optimized the page for low hanging fruits, add an annotation inside Google Analytics.

Then wait a few weeks. You can go back to Google Search Console and see how that keyword phrase is performing by comparing date ranges.

Click the “Date” filter option. Then click on “Compare” and select the appropriate dates.

Date Filter

You can then see how the page has performed since you made the changes.

Comparison

If it hasn’t produced any movement, then reassess your optimization and content strategy.

If you feel that both categories are on-point, then I recommend examining the UI/UX, your site’s architecture leading to that page, and the backlink profile for that page.

Increase Organic Search CTR

The next way to leverage Google Search Console data is to increase your organic search Click Through Rate (CTR).

There is no faster way to get more organic search traffic than increasing your CTR. Here’s how to do it:

Go to the “Search Results” section, select “Average CTR” and “Average Position”.

Average CTR

Then scroll down and click on the filter button. Select “CTR”, click “Smaller than” from the dropdown, and enter “1.0”.

smaller than 1

Then go back to the filter and select “Position”, click on “Smaller than”, and enter “10”.

smaller than 10

Now you should be looking at keyword phrases that you’re performing well for, but your CTR is lacking.

Bad CTR
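
In code, that double filter is just two conditions over the rows pulled in the earlier API sketch (the thresholds mirror the ones above):

    # Page-one queries whose CTR is under 1%, reusing `rows` from before.
    underperformers = [r for r in rows if r["position"] <= 10 and r["ctr"] < 0.01]
    for r in sorted(underperformers, key=lambda r: r["impressions"], reverse=True):
        print(f'{r["keys"][0]}: position {r["position"]:.1f}, CTR {r["ctr"]:.2%}')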

Now there are a few things to consider before I explain how to optimize for CTR.

Here are 4 Factors That Will Impact Organic CTR:

  1. Your position (lower rankings = lower CTR)
  2. Google Ads (more ads = lower CTR)
  3. SERP features (more SERP features = lower CTR)
  4. Search intent

Search intent isn’t as obvious as the others. In general, navigational search phrases (like “Gotch SEO”) that you don’t own will have low CTR. For example, my CTR for “blogger.com” is a brutal 0.1%.

I can push my rankings up further for this phrase, but I know it’s a waste of time and resources. Why?

Because people searching navigational phrases are generally looking for the brand itself. The takeaway is that you should prioritize increasing organic search CTR for non-navigational keywords.

That means you should focus on informational keywords like “SEO competitive analysis” or “affordable SEO service”.

Now let me explain how to actually increase your organic search CTR.

How to Increase Organic CTR

Select an informational keyword with low CTR. You should pick a keyword that has low CTR and a high position. Sort the data by “Position” to see the top-ranked keywords.

In this example, I’m going to focus on “buy backlinks for SEO”.

keyword target

Once you’ve selected a target keyword, benchmark its current CTR. I recommend adding this data to an annotation in Google Analytics.

Keep this open because you’ll be adding whatever changes you made to it as well.

Annotate in Google Analytics

Now you need to examine the SERPs for that keyword. The first thing I notice in my situation is the ads.

SERP

Take note of the headlines.

The next thing to consider is my page that’s ranking #1.

Does the search intent for the keyword phrase “buy backlinks for SEO” match my page?

search engine result

I think it’s appropriate that it’s ranking because it is on topic.

However, someone searching “buy backlinks for SEO” seems to already have an objective in mind: they’ve already made the decision to “buy backlinks”.

That means they may not want to change their mind about buying links.

That could be a reason why the CTR for this page is suffering.

So, in this example, it doesn’t make much sense for me to change the strategy of my blog post.

This is a good reminder that being an SEO expert isn’t always about what you do. It’s also about what you don’t do.

Since that keyword phrase didn’t pan out, let’s take a look at the keyword phrase: “seo st louis”.

ctr example

From a quick SERP analysis, it’s easy to understand why the CTR is so poor for that page. There’s a Google Ad and the local pack is pushing the organic results below the fold.

SERP features

Now if I was serious about ranking for this keyword phrase, I would focus on the local pack. My page is ranking #2 in organic search and there’s a lot of room for improvement.

SERP features example

The first step to increasing your organic CTR is to improve your position. In this case, I would see a major boost in CTR by moving from the #2 to the #1 spot.

I recommend optimizing, improving, and adding more content. Then if it makes sense, try to acquire links to the page. While that’s happening you also want to try to improve your CTR.

I always look for a featured snippet because that’s an easy way to increase your CTR. You have to restructure and optimize your page for featured snippets.

Take note of a few elements in this example:

  • The #1 result has structured data and breadcrumbs showing
  • The #3 result has site links

That means that our page can also get those features. It makes sense to add reviews that use structured data to our page.

Not only will this increase the organic CTR, but it will also add more unique user-generated content.

I’ve talked about a lot of technical optimization tactics. Now I want to show you how to optimize your title tags and meta descriptions for CTR.

You’ll need to put on your copywriting hat for this.

The first question is:

What does this searcher want the most when they search “seo st louis”?

This individual wants to work with a competent, trustworthy, and successful SEO company located in St. Louis. They likely want to meet face-to-face and shake your hand.

How do I know?

Because I’ve done exactly that with countless businesses in this area. Business owners in St. Louis operate differently from how they operate in New York City.

These nuances are huge. I won’t get into the psychology of midwest culture right now.

That said, your title and meta description should persuade your ideal customer.

So, if you’re targeting business owners in St. Louis, you need to clearly state why you’re the best option.

“Best St. Louis SEO Company” is a good start, but WHY is Gotch SEO the “best”?

Ask yourself:

What does a business owner in St. Louis value the most?

Some things they might value are:

  • Only working with a company that’s located in St. Louis
  • Working with a company that’s willing to meet in person and shake their hand
  • Working with a proven company with a visible track record of results
  • A sense of security that trying SEO again will work this time because it’s never worked in the past

List as many ideas as you can. Step into their shoes. Then, create at least 10 different headlines using these ideas.

Here are some examples:

  • #1 SEO Company Located in St. Louis (with Over 153 5-Star Reviews)
  • #1 St. Louis SEO Company (See Why 134 Other Companies Trust Us)
  • St. Louis SEO Company That’s Driven Over $12,031,231 for Clients
  • Most Trusted SEO Company in St. Louis (139 Real 5-Star Reviews)
  • The Only ROI-Driven SEO Company in St. Louis
  • St. Louis SEO Company with Over 1,304,012 First Page Rankings
  • #1 Recommended St. Louis SEO Company (Insane Results for Clients)
  • St. Louis SEO Company – Get 112% More Traffic Like Our Clients
  • St. Louis SEO Company – Get 212% More Revenue Like Our Clients
  • St. Louis SEO Company – See Why 174 Others Trust Us

The combinations are endless. That’s why it’s critical that you test.

Add these same concepts to your meta description as well.

I recommend making your changes and then waiting at least a few weeks to see the results.

Make sure you annotate your changes in Google Analytics.

If you don’t see better performance, then iterate, and test again. There is no “end” to optimizing a website for organic search.

Perform Conversion Rate Optimization (CRO)

The third way to leverage Google Search Console data has nothing to do with SEO.

I recommend that you perform Conversion Rate Optimization (CRO) on high-performing organic search pages.

Getting traffic is nice, but converting that traffic into leads and new customers is even better.

I won’t get into CRO here, but there are plenty of dedicated resources on the topic.

One thing that you need to keep in mind is that every single page on your site should have a goal.

It doesn’t have to always be transactional either. In fact, trying to score a sale is a poor strategy because ~98% of website visitors are not ready to buy.

That’s why it’s fundamental that you convert a percentage of that traffic into email subs or get them on a retargeting list.

Track Branded Search Performance

The last way to leverage this data isn’t actually a tactic at all. I recommend monitoring your branded search performance.

Branded Search Performance

While ranking for informational keywords is critical for increasing traffic, branded search is what will keep you afloat when rankings fluctuate.

The question is:

How do you get more branded searches?

You need an all-encompassing marketing strategy outside of SEO.

In general, if you produce exceptional value and your products are excellent, then you’ll get branded searches.

I recommend using Google Search Console to track your branded search performance every month.

If it’s not growing, then you know you need to adjust your strategy.
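
One way to make that monthly check concrete is to split the query report into branded and non-branded buckets. A sketch reusing `rows` from the earlier Search Console API example; “gotch” is a stand-in for your own brand term:

    import re

    brand = re.compile(r"gotch", re.IGNORECASE)  # placeholder brand pattern
    branded = [r for r in rows if brand.search(r["keys"][0])]
    print(f"branded queries: {len(branded)}")
    print(f"branded clicks:  {sum(r['clicks'] for r in branded)}")

Run it against a few date ranges and compare the totals month over month.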

Bonus Google Search Console Sections to Investigate

Another Google Search Console section you’ll want to investigate is “Enhancements”.

Enhancements section

The “Mobile Usability” section will show you issues impacting the mobile search user’s experience on your website. It’s important to fix anything that shows up here.

Mobile usability section

Think about it this way:

If Google is dedicating a section to it, it’s likely an important factor for organic search performance.

The same logic applies to every section within Google Search Console.

The “Security & Manual Actions” section is one to visit if your organic search traffic falls.

security and manual actions

Both manual actions and security-related issues can wreck your traffic. Go to this section first if your traffic plummets.

No manual actions detected

The last section to examine is the “Links” section.

Links

I prefer using Ahrefs for all link analysis, but Google Search Console can give you some solid intel. It doesn’t give you all your link data.

However, it is a decent sample set. One thing to examine is the “Top linking text” section.

Top linking text

This is your external link anchor text profile. Ideally, your “Top linking text” should be branded.

You should also look at the “Internal links” section because it may indicate some inefficiencies with your site architecture.

Internal links section

For example, my “best link building services” page may not have as many crawler pathways as I would like.

Internal links example

The appropriate action would be to create more internal links to that page, so it performs better.

The other way to use this section is if you get a manual or algorithmic penalty. In many cases, websites get penalized because of low-quality links and over-optimized anchor text.

How to Clean Up Your Link Profile Using Google Search Console

Click “Export External Links” and select “More sample links”.

Export links

Then copy 200 of these URLs and open up Ahrefs. Go to “More” in the navigation and click on “Batch analysis”.

Ahrefs Batch Analysis

Paste the URLs, click the dropdown under “Target mode”, select “domain with all its subdomains” and start the analysis.

Quick batch analysis ahrefs

Click “Export” and open the file.

Ahrefs batch analysis export

Delete every column except for “Target”, “Domain Rating”, “Ref domains Dofollow”, “Total Backlinks”, “Total Keywords”, and “Total Traffic”. Then copy the data and paste it into a Google Sheet (or you can filter through it in the .csv).

backlink profile analysis

I would filter the links by “Domain Rating” and then manually go through each link.

You can categorize them as “Good, Okay, Bad”.
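
If you’d rather do that filtering in code than in a spreadsheet, a short pandas sketch works too. The filename is hypothetical, and the column names mirror the Ahrefs export described above (they may differ between export versions):

    import pandas as pd  # third-party: pip install pandas

    cols = ["Target", "Domain Rating", "Ref domains Dofollow",
            "Total Backlinks", "Total Keywords", "Total Traffic"]
    df = pd.read_csv("ahrefs_batch_analysis.csv")[cols]
    df = df.sort_values("Domain Rating")  # review the weakest domains first
    df["Verdict"] = ""  # fill in Good / Okay / Bad during manual review
    df.to_csv("backlink_review.csv", index=False)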

I won’t get into link analysis here, but I recommend reading this article about the best link building services and my backlinks guide.

These will both give you a framework for what a quality link looks like.

That’s a Wrap!

Google Search Console is a robust free SEO tool that cannot be overlooked. Take advantage of it and start increasing your organic search traffic.


AMP'd Up for Recaptcha

Beyond search, Google controls the leading distributed ad network, the leading mobile OS, the leading web browser, the leading email client, the leading web analytics platform, and the leading free video hosting site.

They win a lot.

And they take winnings from one market & leverage them into manipulating adjacent markets.

Embrace. Extend. Extinguish.

AMP is an utterly unnecessary invention designed to further shift power to Google while disenfranchising publishers. From the very start it had many issues with basic things like supporting JavaScript, double counting unique users (no reason to fix broken stats if they drive adoption!), not supporting third party ad networks, not showing publisher domain names, and just generally being a useless layer of sunk cost technical overhead that provides literally no real value.

Over time they have corrected some of these catastrophic deficiencies, but if it provided real value, they wouldn’t have needed to force adoption with preferential placement in their search results. They force the bundling because AMP sucks.

Absurdity knows no bounds. Googlers suggest: “AMP isn’t another “channel” or “format” that’s somehow not the web. It’s not a SEO thing. It’s not a replacement for HTML. It’s a web component framework that can power your whole site. … We, the AMP team, want AMP to become a natural choice for modern web development of content websites, and for you to choose AMP as framework because it genuinely makes you more productive.”

Meanwhile some newspapers have about a dozen employees who work on re-formatting content for AMP:

The AMP development team now keeps track of whether AMP traffic drops suddenly, which might indicate pages are invalid, and it can react quickly.

All this adds expense, though. There are setup, development and maintenance costs associated with AMP, mostly in the form of time. After implementing AMP, the Guardian realized the project needed dedicated staff, so it created an 11-person team that works on AMP and other aspects of the site, drawing mostly from existing staff.

Feeeeeel the productivity!

Some content types (particularly user generated content) can be unpredictable & circuitous. For many years forums websites would use keywords embedded in the search referral to highlight relevant parts of the page. Keyword (not provided) largely destroyed that & then it became a competitive feature for AMP: “If the Featured Snippet links to an AMP article, Google will sometimes automatically scroll users to that section and highlight the answer in orange.”

That would perhaps be a single area where AMP was more efficient than the alternative. But it is only so because Google destroyed the alternative by stripping keyword referrers from search queries.

The power dynamics of AMP are ugly:

“I see them as part of the effort to normalise the use of the AMP Carousel, which is an anti-competitive land-grab for the web by an organisation that seems to have an insatiable appetite for consuming the web, probably ultimately to it’s own detriment. … This enables Google to continue to exist after the destination site (eg the New York Times) has been navigated to. Essentially it flips the parent-child relationship to be the other way around. … As soon as a publisher blesses a piece of content by packaging it (they have to opt in to this, but see coercion below), they totally lose control of its distribution. … I’m not that smart, so it’s surely possible to figure out other ways of making a preload possible without cutting off the content creator from the people consuming their content. … The web is open and decentralised. We spend a lot of time valuing the first of these concepts, but almost none trying to defend the second. Google knows, perhaps better than anyone, how being in control of the user is the most monetisable position, and having the deepest pockets and the most powerful platform to do so, they have very successfully inserted themselves into my relationship with millions of other websites. … In AMP, the support for paywalls is based on a recommendation that the premium content be included in the source of the page regardless of the user’s authorisation state. … These policies demonstrate contempt for others’ right to freely operate their businesses.

After enough publishers adopted AMP, Google was able to turn their mobile app’s homepage into an interactive news feed below the search box. And inside that news feed Google gets to distribute MOAR ads while 0% of the revenue from those ads finds its way to the publishers whose content is used to make up the feed.

Appropriate appropriation. 😀

Thank you for your content!!!

The mainstream media is waking up to AMP being a trap, but their neck is already in it:

European and American tech, media and publishing companies, including some that originally embraced AMP, are complaining that the Google-backed technology, which loads article pages in the blink of an eye on smartphones, is cementing the search giant’s dominance on the mobile web.

Each additional layer of technical cruft is another cost center. Things that sound appealing at first blush may not be:

The way you verify your identity to Let’s Encrypt is the same as with other certificate authorities: you don’t really. You place a file somewhere on your website, and they access that file over plain HTTP to verify that you own the website. The one attack that signed certificates are meant to prevent is a man-in-the-middle attack. But if someone is able to perform a man-in-the-middle attack against your website, then he can intercept the certificate verification, too. In other words, Let’s Encrypt certificates don’t stop the one thing they’re supposed to stop. And, as always with the certificate authorities, a thousand murderous theocracies, advertising companies, and international spy organizations are allowed to impersonate you by design.

Anything that is easy to implement & widely marketed often has costs added to it in the future as the entity moves to monetize the service.

This is a private equity firm buying up multiple hosting control panels & then adjusting prices.

This is Google Maps drastically changing their API terms.

This is Facebook charging you for likes to build an audience, giving your competitors access to those likes as an addressable audience to advertise against, and then charging you once more to boost the reach of your posts.

This is Grubhub creating shadow websites on your behalf and charging you for every transaction created by the gravity of your brand.

Shivane believes GrubHub purchased her restaurant’s web domain to prevent her from building her own online presence. She also believes the company may have had a special interest in owning her name because she processes a high volume of orders. … it appears GrubHub has set up several generic, templated pages that look like real restaurant websites but in fact link only to GrubHub. These pages also display phone numbers that GrubHub controls. The calls are forwarded to the restaurant, but the platform records each one and charges the restaurant a commission fee for every order

Settling for the easiest option drives a lack of differentiation, embeds additional risk & once the dominant player has enough marketshare they’ll change the terms on you.

Small gains in short term margins for massive increases in fragility.

“Closed platforms increase the chunk size of competition & increase the cost of market entry, so people who have good ideas, it is a lot more expensive for their productivity to be monetized. They also don’t like standardization … it looks like rent seeking behaviors on top of friction” – Gabe Newell

The other big issue is platforms that run out of growth space in their core market may break integrations with adjacent service providers as each want to grow by eating the other’s market.

Those who look at SaaS business models through the eyes of a seasoned investor will better understand how markets are likely to change:

“I’d argue that many of today’s anointed tech “disruptors” are doing little in the way of true disruption. … When investors used to get excited about a SAAS company, they typically would be describing a hosted multi-tenant subscription-billed piece of software that was replacing a ‘legacy’ on-premise perpetual license solution in the same target market (i.e. ERP, HCM, CRM, etc.). Today, the terms SAAS and Cloud essentially describe the business models of every single public software company.

Most platform companies are initially required to operate at low margins in order to buy growth of their category & own their category. Then when they are valued on that, they quickly need to jump across to adjacent markets to grow into the valuation:

Twilio has no choice but to climb up the application stack. This is a company whose ‘disruption’ is essentially great API documentation and gangbuster SEO spend built on top of a highly commoditized telephony aggregation API. They have won by marketing to DevOps engineers. With all the hype around them, you’d think Twilio invented the telephony API, when in reality what they did was turn it into a product company. Nobody had thought of doing this let alone that this could turn into a $17 billion company because simply put the economics don’t work. And to be clear they still don’t. But Twilio’s genius CEO clearly gets this. If the market is going to value robocalls, emergency sms notifications, on-call pages, and carrier fee passed through related revenue growth in the same way it does ‘subscription’ revenue from Atlassian or ServiceNow, then take advantage of it while it lasts.

Large platforms offering temporary subsidies to ensure they dominate their categories & companies like SoftBank spraying capital across the markets is causing massive shifts in valuations:

I also think if you look closely at what is celebrated today as innovation you often find models built on hidden subsidies. … I’d argue the very distributed nature of microservices architecture and API-first product companies means addressable market sizes and unit economics assumptions should be even more carefully scrutinized. … How hard would it be to create an Alibaba today if someone like SoftBank was raining money into such a greenfield space? Excess capital would lead to destruction and likely subpar returns. If capital was the solution, the 1.5 trillion that went into telcos in late ’90s wouldn’t have led to a massive bust. Would a Netflix be what it is today if a SoftBank was pouring billions into streaming content startups right as the experiment was starting? Obviously not. Scarcity of capital is another often underappreciated part of the disruption equation. Knowing resources are finite leads to more robust models. … This convergence is starting to manifest itself in performance. Disney is up 30% over the last 12 months while Netflix is basically flat. This may not feel like a bubble sign to most investors, but from my standpoint, it’s a clear evidence of the fact that we are approaching a something has got to give moment for the way certain businesses are valued.”

Circling back to Google’s AMP, it has a cousin called Recaptcha.

Recaptcha is another AMP-like trojan horse:

According to tech statistics website Built With, more than 650,000 websites are already using reCaptcha v3; overall, there are at least 4.5 million websites use reCaptcha, including 25% of the top 10,000 sites. Google is also now testing an enterprise version of reCaptcha v3, where Google creates a customized reCaptcha for enterprises that are looking for more granular data about users’ risk levels to protect their site algorithms from malicious users and bots. … According to two security researchers who’ve studied reCaptcha, one of the ways that Google determines whether you’re a malicious user or not is whether you already have a Google cookie installed on your browser. … To make this risk-score system work accurately, website administrators are supposed to embed reCaptcha v3 code on all of the pages of their website, not just on forms or log-in pages.

About a month ago, when logging into Bing Ads, I saw Recaptcha on the login page & couldn’t believe they’d give Google control at that access point. I think they got rid of that, but lots of companies are perhaps shooting themselves in the foot through a combination of over-reliance on Google infrastructure AND sloppy implementation.

Today, when making a purchase on Fiverr, after converting, I got some of this action:

Hmm. Maybe I will enable JavaScript and try again.

Oooops.

That is called snatching defeat from the jaws of victory.

My account is many years old. My payment type on record has been used for years. I have ordered from the particular seller about a dozen times over the years. And suddenly because my web browser had JavaScript turned off I was deemed a security risk of some sort for making an utterly ordinary transaction I have already completed about a dozen times.

On AMP JavaScript was the devil. And on desktop not JavaScript was the devil.

Pro tip: Ecommerce websites that see substandard conversion rates from using Recaptcha can boost their overall ecommerce revenue by buying more Google AdWords ads.

As more of the infrastructure stack is driven by AI software there is going to be a very real opportunity for many people to become deplatformed across the web on an utterly arbitrary basis. That tech companies like Facebook also want to create digital currencies on top of the leverage they already have only makes the proposition that much scarier.

If the tech platforms host copies of our sites, process the transactions & even create their own currencies, how will we know what level of value they are adding versus what they are extracting?

Who measures the measurer?

And when the economics turn negative, what will we do if we are hooked into an ecosystem we can’t spend additional capital to get out of when things head south?



The Future of SEO – How Google Search Has Changed in 2019 & How It Affects You

Even though the year is not over yet, there have been some big changes in Google search in 2019 that have already impacted the results and users, along with all the SEO agencies and companies.

The future of search looks like it will actually involve less searching, as weird as that might sound. Google has performed lots of changes in the SERP; lots of algorithms and features were added and updated, which makes things even more complex. While Google search in some countries lacks optimization, other countries experience major fluctuations.

How Google Search changed in 2019

 

That being said, we’ve searched through our memories and the web to understand how Google has changed in 2019 so far, and here are the top highlights:

  1. Pick up Where You Left off on Search
  2. Confirmed Broad Core Update in March 
  3. New Way to Explore Information with Google Discover
  4. Enriching Search Results Through Structured Data
  5. Mobile-First Indexing by Default for New Domains
  6. The June 2019 Core Update
  7. The Site Diversity Update Roll Out
  8. Navigate the Search More Easily and Safer

 

Last year alone, Google performed 3,200 changes to their system, including features and regular updates meant to keep the results relevant. Moreover, they say evolution is the key to facing these changes.

“Sometimes the web just evolved. Sometimes what users expect evolves and similarly, sometimes our algorithms are, the way that we try to determine relevance, they evolve as well.”
John Mueller, Webmaster Trends Analyst at Google

1.  Pick up Where You Left off on Search

 

Right at the beginning of the year, Google added a new feature for users who perform mobile searches, launching activity cards to help them pick up where they left off. For example, if you look for search queries such as “clean eating recipes” or “clean eating diets” and you are logged into your Google account, you’ll find an activity card at the top of the results page. That way, it provides easy access and is a handy way to continue exploring and revisit your previous searches.

You’ll only be able to see the pages you’ve visited, along with the searches you’ve made. You can either click on the pages that you viewed before or perform another search to discover other aspects of that topic. That way, you can pick up where you left off on Google search.

Moreover, you can bookmark a specific recipe and add it to your collection for future reference. Your collections can be accessed through the menu on the top left of the Search page or through the bottom bar of the Google app. You can have as many collections as you want.

Save Google search in your account

 

Easily edit your search activity and card by deleting pages or turning them off by tapping the three-dot icon. 

 

This change was designed to make search history more accessible and more useful, helping users follow their interests, build new habits, keep up with their tasks, and keep valuable information in sight.

2. Confirmed Broad Core Update in March 

 

The first confirmed update of this year from Google happened on March 13. Initially, the name going around was Florida 2.0, but Google themselves named this update the Google March 2019 Core Update. The unlucky 13 came on a Wednesday.

The updates could be spotted using cognitiveSEO signals by looking at the chart: big fluctuations, marked with red bars, indicate possible updates. Below you can see a screenshot from the tool showing how the core update looked on that very day.

SERP fluctuations March 13 cognitiveSEO signals

Google made it official on Twitter, where you can see the whole thread and the discussions on the topic.

While Google performs lots of focused updates nearly daily, there are some broad algorithm updates that happen several times per year. This update was about quality, designed to improve the overall results, with no other specific details given.

For every site that was impacted, there’s nothing specific to do to fix it; at least, that’s what Google SearchLiaison said in a tweet. Building great content should be everyone’s mantra. Link building, social media marketing, and online presence also matter in the SEO industry, so things remain the same.

There’s no “fix” for pages that may perform less well other than to remain focused on building great content. Over time, it may be that your content may rise relative to other pages.
Google SearchLiaison

Since content is mentioned, it was most probably a content-related update, focusing on the quality delivered to the user. To stay ahead of competitors, you should always look for opportunities to deliver great content that can go viral. Look for trends and evergreen content. Even with this in mind, ranking in Google can look like a gamble; personalization and localization will become crucial, and your focus will have to change accordingly.

3. New Way to Explore Information with Google Discover

 

Named Google Feed when it was first released and renamed Google Discover a few months ago, the tool was designed to surface relevant content to users, even when they’re not searching. The product is used by more than 800 million people each month to stay up to date.

Google Discover works very smoothly based on preferences, following certain topics, user behavior, and search history. The tool recommends content related to anything you might be interested in, such as sports, TV shows, actors, public personalities, brands, flights, travel, weather info, and many more.


Google Discover can have major implications for how users interact with search and the web. Since users can personalize their search and select only the things that interest them, Google Search itself might lose some of its popularity. With that in mind, businesses will not compete to be in the first 10 positions to win the user’s heart, but rather in the top 1-3 (at most), because that is the number of results a user sees on a mobile screen when performing a search.

So, when a user searches for a topic, Google looks at each user’s patterns and tries to understand how the subtopics relate to each other. Based on the data collected, Google Discover will surface relevant topics that might interest the user next, predicting their future searches.

Discover is one step ahead. 

 

Discover has a lot of potential and it might change user behavior regarding online searching and information consumption.

 

4. Enriching Search Results Through Structured Data & Search Console

 

Google has always encouraged websites to add markup to their content so that search engines can better understand what it’s about. Implementing structured data on your website will probably offer you the chance to receive an enhanced appearance in Google search results.

In 2019, Google is pushing website owners to use more structured data for highlighting all sorts of content and achieving more goals through a lot of great new features:

  • Increase brand awareness: for logos, local businesses, and the sitelinks search box.
  • Mark up your content for more traffic: for articles, breadcrumbs, events, jobs, Q&A, recipes, and reviews.
  • Highlight products on the SERP for conversions: for price, availability, and review ratings.

With such a big range of content that you can use structured data for, the SERP has definitely changed. The newest additions to the list are structured data for FAQ and “How-to” types of content. For example, “how-to” content can be highlighted by adding structured data to the steps, tools, duration, and other properties. Take a look at the example below:

How to make slime rich snippets
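
To make the idea concrete, here is an illustrative HowTo payload built as a Python dict and serialized to JSON-LD. The field values are invented; check Google’s structured data documentation for the required and recommended properties:

    import json

    how_to = {
        "@context": "https://schema.org",
        "@type": "HowTo",
        "name": "How to make slime",
        "totalTime": "PT30M",  # ISO 8601 duration
        "step": [
            {"@type": "HowToStep", "name": "Mix glue and water",
             "text": "Combine the glue and water in a bowl."},
            {"@type": "HowToStep", "name": "Add the activator",
             "text": "Stir in the activator until the slime forms."},
        ],
    }
    # Paste the output into a <script type="application/ld+json"> tag on the page.
    print(json.dumps(how_to, indent=2))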

 

There is detailed how-to and FAQ documentation provided by Google to help you perform a correct setup. Explore Google’s search gallery and look at the right documentation for each element.

Moreover, you can build how-to guide actions and FAQ answers with markup for Google Assistant, which means that how-to and FAQ answers can also be surfaced on the Google Assistant. Take a look at the example below:

FAQ-Google Assistant

 

All this structured data is supported both on Google Search and Google Assistant, and in Search Console you can see reports and monitor your website’s performance.

If Google Search Console detects some issues, you’ll see them in the Enhancements report.

Enhancements report for Structured data issues

 

The structured data reporting in GSC has lots of advantages and allows you to review errors, warnings, and valid items, plus the pages associated with such issues.

5. Mobile-First Indexing by Default for New Domains

 

This year, at the end of May, Google announced that as of July 1, 2019, mobile-first indexing would be enabled by default for all new websites. With that, it became official that websites should present the same content to users and search engines on both mobile and desktop devices.

At the time, we even wrote a 5-step guideline to help websites upgrade their mobile search engine optimization efforts and be prepared for the change.

Mobile-first indexing notification

 

You can check any URL from your website using the URL Inspection tool in Google Search Console. Doing so, you’ll see how it was crawled and indexed.

Googlebot smartphone

 

With mobile-first indexing launching, Google search is changing: sites have to show the same content on mobile and desktop in order to keep their rankings on both. For newly designed websites, Google determines compliance by checking factors such as parity of content (including text, images, videos, and links), structured data, and other meta-data (such as titles and descriptions and robots meta tags).
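
One crude way to sanity-check content parity yourself is to fetch a page with a desktop and a mobile user-agent string and compare what comes back. A minimal sketch (the URL and user-agent strings are illustrative; a thorough check would compare rendered DOM and extracted text, not byte counts):

    import requests  # third-party: pip install requests

    URL = "https://example.com/"
    UAS = {
        "desktop": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "mobile": ("Mozilla/5.0 (Linux; Android 9; Pixel 3) AppleWebKit/537.36 "
                   "(KHTML, like Gecko) Chrome/75.0 Mobile Safari/537.36"),
    }
    for name, ua in UAS.items():
        html = requests.get(URL, headers={"User-Agent": ua}).text
        print(f"{name}: {len(html)} bytes")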

 

As mentioned before, Google recommends that websites be mobile-friendly and keeps encouraging responsive web design. Since mobile-first indexing is now running, websites should use a single URL for both the desktop and mobile versions.

6. The June 2019 Core Update

 

Google announced on Twitter that a core update was rolling out on June 3, 2019. Similar to the update from March, it is one of the several broad core algorithm updates that happen each year.

While Google hasn’t made any official statements on this update, John Mueller said that core updates encompass a broader range of ranking factors. Although he said there’s nothing specific to fix, lots of websites experienced ranking drops. You can hear what he specifically said in the Webmaster Central office-hours hangout when a user asked him about the update:

I think it’s a bit tricky because we’re not focusing on something very specific where we’d say like for example when we rolled out the speed update.

(…)Sometimes the web just evolved. Sometimes what users expect evolves and similarly, sometimes our algorithms are, the way that we try to determine relevance, they evolve as well.

John Mueller, Webmaster Trends Analyst at Google

He points to a blog post written by Amit Singhal, formerly the head of Google’s search team for 15 years, about website quality, which can really help users in situations like these.

To understand the situation a little better, use cognitiveSEO signals and look at the fluctuations. Below is a screenshot with the evolution of search when the Google update was released.

cognitiveSEO signals June Algortihm update devices

 

I looked for sites whose rankings changed by at least 3 positions in desktop, mobile, and local results on the first page of Google for the US market. cognitiveSEO signals monitors over 170,000 randomly selected keywords across desktop, mobile, and local rankings to offer valuable insights into what happens in Google, so you can spot significant fluctuations.

7. The Site Diversity Update Roll Out

 

Right after the June 2019 Core Update, Google rolled out the Site Diversity update, which is completely different and completely unrelated. This update was created in the interest of the user, so they won’t see more than two listings from the same site in the top results. In the screenshot below you can see the top search results for “create dental website”, where www.pbhs.com has two search results.

Website that dominates SERP

 

Google explains that there might be situations where you could see more than two search results in Google, when their systems determine that it is relevant to do so for a particular search. Whether it is content, website speed, RankBrain, conversion rate optimization, or any other factor, Google’s algorithm will decide what to show. The good thing is that it keeps being improved and refined in order to offer more relevant results.

It may be a while until this update works wonders, because there are still people who see more than two results from the same website in Google’s top results, as you can see if you look at people’s tweets.

Danny Sullivan, Google’s public Search Liaison, says it will affect only main listings, though.

8. Navigate the Search More Easily and Safer

 

Continuing with diversity in the SERP, Google wants to show more types of content, including featured snippets with results that might interest you, answer boxes, and Knowledge Panels, which can help you find key information, or predict your searches using Autocomplete.

Their search results page doesn’t look like what we used to know, with a list of blue links and some ads. Now it is more personalized, offering information from lots of sources in all sorts of forms: video content, visual content, and text, to connect you with useful information as quickly as possible.

If you look at the example below, you have lots of information above the fold, allowing you to perform as few actions as possible and see everything in one click or two.

Stranger things search results

 

Takeaway

 

Ask a fresh pair of eyes to review your website and find new ways to improve it. Moreover, look together at who is searching through your website, what they are looking for, and how they flow through it. Evaluate the user experience by looking at user behavior. All these insights are a good exercise to help you understand the user better.

 

If the user changes, their behavior will change, and therefore search will change. In this equation, your website should change, and improve, as well. The role of search engine optimization in the future of search is getting even more complicated, and Google is certainly shaping that future.

The post The Future of SEO – How Google Search Has Changed in 2019 & How It Affects You appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.


Empathetic Consulting: 3 Things to Remember When Working With Other Teams

Posted by LaurelTaylor

Whether you consult with teams within your company or with outside clients, the chances are fairly high that at least once, you’ve left a meeting frustrated by the actions of others, even asking yourself: “why would they do that?”

It’s easy to walk into a project thinking of it as a simple matter of “they brought me in to fix a problem.” But the reality is rarely so simple. Consulting with other teams always entails organizational and emotional nuance that you may not be privy to.

Every interpersonal relationship is unique, and hopefully the circumstances I’m discussing won’t apply to many engagements or projects you take part in. However, when you do end up in a difficult consulting situation, it’s helpful to have a bit of empathy for those you’re working with.

I’ve found that remembering these 3 points can help me put myself in the shoes of my point of contact and interact with them in a way that is sensitive to what they may be dealing with in their environment.

1. Your point of contact may not have asked for your help

It is entirely possible that the person you are trying to help may not want to be helped.

Management has its own ideas sometimes and internal communication isn’t always perfect at any company. This can lead to situations where your point of contact may feel defensive, especially if their job functions seem like they might cover what you are consulting on. The best intentions of a manager who wants to help by bringing in more resources may look like distrust or undermining to the employee who didn’t get a say.

At one point during my stint as an in-house SEO, I actually found myself in this exact position. Leadership brought in an outside agency to help with SEO during a domain migration, and while their intentions may have been to provide more help, they didn’t effectively communicate that to me.

As a result, since I was the one who was responsible for that area, it made me feel insecure about how management viewed me and my skills. I was lucky enough to work with a great consultant who was able to support me and help move forward the many projects that were already in-flight. But because I initially felt like they were undermining my credibility by being involved in the first place, it took a while to build that trust and be able to get things done effectively.

The best way to deal with this potential issue is to ensure that you respect the context and institutional knowledge that the team you are helping possesses. Work to have a collaborative relationship instead of an authoritative one. The more context and communication you have, the better the recommendations you can contribute.

2. If they did ask for help, they may be feeling vulnerable or insecure

Step back for a second and think about why a team might bring in an outside consultant to begin with. There are tons of specific issues they could need assistance with, but all of this boils down to a problem that they presumably want or need help to solve — a problem that they couldn’t solve on their own. Regardless of whether they couldn’t solve it because of knowledge, resources, or even office politics, your contributions add something that they couldn’t contribute themselves — and that can be hard to deal with.

This isn’t something that needs to be discussed with the client or another team, but it is something that you should acknowledge and keep front-of-mind when you communicate with them. Respect the vulnerability of seeking out help, and appreciate the trust that they have placed in you.

3. Your client is accountable for the results of their project

When planning a long-term strategy, making tactical recommendations, or assessing the results of a marketing campaign that you helped execute, it’s easy to feel invested in or accountable for the results of a project. However, it’s important to remember that your point of contact is usually far more accountable for results than you are. Their job, success, and emotions are all on the line much more than yours.

As an outside subject matter expert, your job is to give them all the information and resources to make the best decision. At the end of the day, the choice is theirs. I know how hard it can be to see your recommendations or projects rejected, but it’s important to try not to take it personally if they, having all the facts, make what they believe to be the best decision.

If they seem like they are questioning everything you say, maybe it’s because they want to be 100 percent sure it’s the best approach. Perhaps their micromanaging comes from a place of good intentions: just wanting to follow through and get the best outcome on every aspect of a project. Even what can come off as argumentative or difficult could be them playing devil’s advocate to ensure that everything has been considered.

Wrapping up

All this being said, perhaps none of these circumstances apply to the client that you are finding it hard to work with. People can have bad days, hard years, or even just generally prickly dispositions. But more empathy and compassion in the world is never a bad thing. So, I would encourage anyone who works with other teams to avoid the impulse to judge a harsh response, and instead, consider what may be behind it.

Have you ever been faced with a complicated consulting situation? Share what helped you navigate it in the comments below!



Guide to call tracking and the power of AI for analyzing phone data

Invoca, an AI-powered call tracking platform, published their Call Tracking Study Guide in March of this year. The in-depth guide demystifies call tracking technology and reviews how call tracking tools help marketers connect digital campaign data to inbound customer phone calls.

Call tracking is a powerful way for marketers to understand exactly where phone calls are coming from with granularity that, for the most robust tools, can extend down to the keyword level. This data helps reveal what platforms, publishers, keywords, and channels drive high-intent customers to call and can help marketers create a more informed media allocation strategy. 

Content produced in collaboration with Invoca.

Call tracking 101: A brief introduction

Invoca tracks calls using a snippet of JavaScript code placed on your website. Once the snippet is on the landing page, it swaps out your standard business phone number for a trackable, dynamic phone number that is unique to each website visitor.

The tag also captures various referrer elements such as UTM source, UTM medium, paid search keyword, and Google click ID; this is what enables Invoca to connect user data to phone calls.
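
Conceptually, this is how dynamic number insertion scripts tend to work in general. The sketch below shows the idea in browser-side TypeScript; the `tracking.example.com` endpoint, the `data-phone` selector, and the response shape are hypothetical stand-ins, not Invoca’s actual tag.

```typescript
// Generic dynamic-number-insertion sketch (NOT Invoca's actual tag).
// Reads referrer/UTM context, asks a hypothetical endpoint for a
// tracking number tied to that context, then swaps it into the page.
async function swapPhoneNumber(): Promise<void> {
  const params = new URLSearchParams(window.location.search);
  const context = {
    utmSource: params.get("utm_source"),
    utmMedium: params.get("utm_medium"),
    keyword: params.get("utm_term"),
    gclid: params.get("gclid"), // Google click ID
    referrer: document.referrer,
  };

  // Hypothetical endpoint that assigns a unique number per visitor.
  const res = await fetch("https://tracking.example.com/assign-number", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(context),
  });
  const { trackingNumber } = (await res.json()) as { trackingNumber: string };

  // Replace every element tagged as a phone number on the page.
  document.querySelectorAll<HTMLElement>("[data-phone]").forEach((el) => {
    el.textContent = trackingNumber;
  });
}

swapPhoneNumber().catch(console.error);
```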

Example of dynamic tracking phone numbers on a landing page—source: Invoca

When the tracking number is called, the platform can also route the caller to the appropriate person or call center depending on what marketing content they were viewing, reducing time on hold and call transfers. Data is collected based on the specific tracking number and can include caller information, keyword, referrer type (e.g., banner ad, search ad, or social media ad), and referral source (e.g., Google, Facebook, etc.); this data can also be used to inform the call center and create a highly personalized experience for the caller.
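
Routing then amounts to a lookup from the dialed tracking number to the marketing context it was minted with. A minimal server-side sketch, with invented numbers and routing rules rather than Invoca’s implementation, might look like this:

```typescript
// Minimal routing sketch (invented data, not Invoca's implementation):
// each tracking number carries the marketing context it was minted with,
// and that context picks the call-center destination.
interface CallContext {
  referrerType: "banner" | "search" | "social";
  source: string; // e.g., "Google", "Facebook"
  keyword?: string;
}

const numberRegistry = new Map<string, CallContext>([
  ["+1-555-0101", { referrerType: "search", source: "Google", keyword: "create dental website" }],
  ["+1-555-0102", { referrerType: "social", source: "Facebook" }],
]);

function routeCall(dialedNumber: string): string {
  const ctx = numberRegistry.get(dialedNumber);
  if (!ctx) return "main-queue"; // unknown number: default queue
  // Route search-driven callers straight to sales, others to support.
  return ctx.referrerType === "search" ? "sales-team" : "support-team";
}

console.log(routeCall("+1-555-0101")); // "sales-team"
```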

Example of referral data info in Invoca

Not all call tracking tools are created equal

There is a large selection of call tracking tools on the market that range from basic to advanced in terms of features and functionality. 

Basic tools provide limited data to marketers: they ignore the larger customer journey and tend to focus on last-touch attribution, making it difficult or impossible to determine where in the journey the call actually came from.

Some metrics a basic tool might track include:

  • Call volume
  • Call time and duration
  • Caller information
  • Basic campaign attribution

These tools provide some sense of campaign performance, but fail to tell the full story that can be gleaned when connecting analytics platforms (e.g. Google Analytics) to call information. 

More advanced AI-powered call tracking tools like Invoca aim to bridge that gap, while also automating some marketing actions after the call takes place. 

Advanced capabilities that AI-powered call tracking tools provide include:

  • Touchpoint attribution—Tie a call back to its source, e.g., paid search or social
  • Data unification—Integrate with multiple online (and offline) sources such as CRM tools
  • Data analysis—Use AI to analyze phone conversations and provide insight on call drivers, behaviors and outcomes
  • Marketing integration—Push data to the marketing stack for automation, optimization, analysis and more

The end result—and key benefit—of implementing an advanced call tracking tool is to gain valuable insight about campaign performance and attribution. 

Call tracking 201: AI and machine learning 

Martech companies are increasingly powering their technology with AI-driven platforms. AI enables marketers to gain intelligence quickly and make better-informed decisions. This trend bridges multiple industries, as shown in the graphic below. 

Companies that utilize or provide AI technology for enterprises—source: TOPBOTS

Invoca uses Signal AI to help measure and attribute online conversions by mining data from the phone conversations themselves, freeing up valuable time for marketers who no longer have to listen to every call.

Signal AI detects intent and patterns in language to provide actionable insights and conversion data (sale made, appointment set, etc.) for marketers. This is accomplished through a series of steps: recording the conversation, transcribing the call into text, analyzing the transcript with an algorithm that identifies key patterns, phrases, and actions, and pushing these insights to your marketing stack. Here’s a visual of what that looks like. Note that Invoca does not save call transcripts and is HIPAA and PCI compliant, an important distinction for marketers concerned with data privacy.

Image source: Invoca
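
As a rough mental model of that pipeline, the steps chain together roughly as below. Every function here is a hypothetical stub, since Invoca’s internals aren’t public; real speech-to-text and pattern models would sit behind these signatures.

```typescript
// Hypothetical pipeline sketch of the Signal AI flow described above.
// Each step's implementation is a stub; real speech-to-text and pattern
// models would sit behind these signatures.
type Insight = { signal: string; confidence: number };

async function transcribe(recordingUrl: string): Promise<string> {
  // Placeholder for a speech-to-text service call.
  return `transcript of ${recordingUrl}`;
}

function detectSignals(transcript: string): Insight[] {
  // Placeholder for the pattern/intent model; here, a trivial keyword scan.
  const signals: Insight[] = [];
  if (/appointment/i.test(transcript)) {
    signals.push({ signal: "appointment_set", confidence: 0.9 });
  }
  if (/buy|purchase/i.test(transcript)) {
    signals.push({ signal: "sale_made", confidence: 0.8 });
  }
  return signals;
}

async function pushToMarketingStack(insights: Insight[]): Promise<void> {
  // Placeholder: forward insights to analytics/automation tools.
  console.log("pushing insights", insights);
}

async function processCall(recordingUrl: string): Promise<void> {
  const transcript = await transcribe(recordingUrl);
  const insights = detectSignals(transcript);
  await pushToMarketingStack(insights);
  // Per the article, transcripts are not retained after analysis.
}

processCall("https://example.com/recordings/123.mp3").catch(console.error);
```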

Signal AI is built on machine learning, an application of AI that gives machines access to data so that they can learn from it. AI works in conjunction with machine learning to provide actionable and accessible data to marketers, but marketers still need to review this data and make decisions based on their own observations and conclusions.

Invoca offers two versions of Signal AI to their call tracking clients. Pre-trained AI uses industry-based predictive models that have been “pre-trained” using thousands of hours of call data.

Custom AI is more appropriate for certain businesses, such as those with high volumes of calls or sophisticated data needs. This more complex option takes longer to create and implement, however, it can help certain businesses predict call outcomes with a higher degree of accuracy.

Debunking some common assumptions 

Skeptics may think that humans can classify calls more efficiently and accurately than AI, but the truth is the opposite. AI learns over time and it never gets tired, so it’s an effective and accurate way to classify calls without bias. Here are some other call tracking myths, debunked:

  • It’s hard to set up AI-based call tracking—Pre-trained AI models take the guesswork out of setup for certain industries such as insurance and can identify the most common outcomes (e.g., product purchased).
  • All AI-based call tracking is the same—False! Invoca’s Signal AI uses predictive analytics (rather than just transcription) and continues to learn. It also provides performance scoring for easy reference.
  • Only big companies can afford AI-based call tracking—Wrong again. Invoca is tag-based and easy to implement. You don’t need a dedicated IT team or programmer to get up and running.

Clear strategy and clean data

The true power of AI-based call tracking is, in a word, attribution. It’s the ability to unify call data across multiple sources and attribute it to all consumer touchpoints.

Invoca does this by collecting data from multiple sources: campaign and website data, first-party data (e.g., pulled from your CRM), third-party demographic data, call data such as length, time and location of call, and conversational data (derived from speech analysis).

Once all the available data is unified, Invoca’s technology determines the value of the call by analyzing the spoken conversations within the calls. Invoca’s AI synthesizes various word patterns (e.g., “I’m almost ready to buy, but I’m waiting for XYZ to happen”) and then classifies them into useful datasets.

Signal AI helps predict the type of call (e.g., sales, service, complaint), which allows marketers to optimize media placements, ad content, and more. This level of analysis can also help inform the call experience itself by identifying issues that may frustrate callers.
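
A toy version of that kind of pattern-based call typing, with made-up phrase lists, could look like the sketch below; production systems use trained models rather than keyword rules like these.

```typescript
// Toy call-type classifier with made-up phrase patterns; real systems
// use trained models rather than keyword lists like these.
type CallType = "sales" | "service" | "complaint" | "unknown";

const patterns: Array<{ type: CallType; regex: RegExp }> = [
  { type: "sales", regex: /ready to buy|pricing|sign up/i },
  { type: "service", regex: /reschedule|how do i|help with/i },
  { type: "complaint", regex: /refund|cancel|unhappy/i },
];

function classifyCall(transcript: string): CallType {
  for (const p of patterns) {
    if (p.regex.test(transcript)) return p.type;
  }
  return "unknown";
}

console.log(classifyCall("I'm almost ready to buy, but I'm waiting for XYZ"));
// -> "sales"
```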

Connecting call data to campaign data can help in other ways, too. For example, marketers can use call information for ad suppression, making sure customers don’t see offers for something they’ve already purchased, or for retargeting ads to people who called but didn’t make a purchase.
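
For instance, a suppression step could be as simple as partitioning callers by outcome before syncing audiences to an ad platform. The field names below are invented for illustration:

```typescript
// Illustrative audience split for ad suppression vs. retargeting.
// Field names are invented; a real integration would sync these lists
// to the ad platform's audience APIs.
interface CallOutcome {
  callerId: string;
  purchased: boolean;
}

function buildAudiences(outcomes: CallOutcome[]) {
  return {
    // Purchasers: suppress them from further offer campaigns.
    suppress: outcomes.filter((o) => o.purchased).map((o) => o.callerId),
    // Callers who didn't buy: candidates for retargeting ads.
    retarget: outcomes.filter((o) => !o.purchased).map((o) => o.callerId),
  };
}

const { suppress, retarget } = buildAudiences([
  { callerId: "a1", purchased: true },
  { callerId: "b2", purchased: false },
]);
console.log(suppress, retarget); // ["a1"] ["b2"]
```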

Tying it all together

One of the most powerful features of the more robust, high-end call tracking tools like Invoca is their ability to integrate with existing marketing platforms like Google Analytics, Adobe Experience Cloud, and Salesforce.

This gives marketers a clear picture of where their customers are at every step of the journey. It closes the attribution loop, allowing you to demonstrate what’s working from an ROI standpoint, a metric that’s key when it comes time for approval and budget allocation.

When considering implementing a tool like Invoca, the bottom line is always the top priority—will we make money with this martech investment?

Invoca customers have seen up to a 60% increase in conversions after implementing the tool (without any additional media spend), an important consideration when factoring in ROI.

The Invoca Call Tracking Guide covers all of this, including what questions to ask vendors and what else to consider when shopping for a call tracking solution.

To learn more about call tracking technology, from functionality to implementation, and how call tracking can help with campaign optimization and attribution, download Invoca’s whitepaper, “The Call Tracking Study Guide for Marketers.”

The post Guide to call tracking and the power of AI for analyzing phone data appeared first on Search Engine Watch.

