SEO Articles

Apple Search

Google, Google, Google

For well over a decade, Google has dominated search to the point where most stories in the search sphere were about Google or something on its periphery.

In 2019 Google generated $134.81 billion in ad revenues.

When Verizon bought core Yahoo three years ago, the final purchase price was $4.48 billion. That amount bought their finance vertical, news vertical, web portal, homepage, email & web search, along with a variety of other services like Tumblr.

Part of what keeps Google so dominant in search is their brand awareness, augmented by distribution as the default in Chrome and Android. When it comes to buying search distribution from other players like Mozilla Firefox, Opera or Apple’s Safari, Google can outbid everyone else: their ad depth makes them much better at monetizing tier-2 and emerging markets than other search companies. Even if Bing gave a 100% revshare to Apple, they still could not compete with Google in most markets in terms of search monetization.

Apple as a Huge Search Traffic Driver

In 2019 Google paid just under £1.2 billion in default payments for UK search traffic, most of which went to Apple. Historically, when Google broke out their search revenues by region, the US was typically around 45% to 46% of search ad revenue & the UK around 11% to 12%, so Google is likely spending north of $10 billion a year to be the default search provider on Apple devices:

Apple submitted that search engines do not pay Apple for the right to be set as the primary default search engine on its devices. However, our assessment is that Google does pay to be the primary default on Apple devices. The agreement between Google and Apple states that Google will be the default web search provider and the same agreement states that Google will pay Apple a specified share of search advertising revenues. We also note that Google does not pay compensation to any partners that set Google Search as a secondary option. This further suggests that Google’s payment to Apple is in return for Apple setting Google as the primary default.

Apple is glad to cash those checks & let Google handle the core algorithmic search function in the web browser, but Apple also auto-completes many searches from within the address bar via various features like website history, top hit, news, Siri suggested website, suggested sites, etc.

A Unique Voice in Search

The nice thing about Apple powering some of those search auto-complete results themselves is that the results are not simply a re-hash of the Google search results. Apple adds a unique voice to the search marketplace: even if your site isn’t doing well in Google, it could still be promoted by Apple based on other factors.

High-traffic Shortcuts

Apple users generally have plenty of disposable personal income and a tendency to dispose of much of it, so if you are an Android user it is probably worth having an Apple device to see what Apple is recommending for core terms in your clients’ markets. If you want to see recommendations for a particular country, you may need a router configured for that country, a web proxy, or a VPN.

Most users likely conduct full search queries and click through to listings from the Google search result page, but over time the search autocomplete feature that recommends previously viewed websites and other sites likely picks up incremental share of voice.

A friend of mine from the UK runs a local site and the following shows how the Apple ecosystem drove nearly 2/3 of his website traffic.

His website is only a couple years old, so it doesn’t get a ton of traffic from other sources yet. As of now his site does not have great Google rankings, but even if it did the boost by the Apple recommendations still provides a tailwind of free distribution and awareness (for however long it lasts).

For topics covered in the news, or for repeat navigational searches, Apple likely sends a lot of direct visits via their URL auto-completion features, but they do not apply the feature broadly across the tail of search in other verticals, so only a limited set of searches ultimately benefits from the shortcuts.

Apple Search Ranking Factors

Apple recently updated their search page offering information about Applebot:

Apple Search may take the following into account when ranking web search results:

Aggregated user engagement with search results
Relevancy and matching of search terms to webpage topics and content
Number and quality of links from other pages on the web
User location based signals (approximate data)
Webpage design characteristics

Search results may use the above factors with no (pre-determined) importance of ranking. Users of Search are subject to the privacy policy in Siri Suggestions, Search & Privacy.

I have seen some country-code TLDs do well in their local markets in spite of not necessarily being associated with large brands. Sites which do not rank well in Google can still end up in the mix provided the user experience is clean, the site is useful and it is easy for Apple to associate the site with a related keyword.

Panda-like Quality Updates

Markets like news change every day as the news changes, but I think Apple also does some Panda-like updates roughly quarterly where they do a broad refresh of what they recommend generally. As part of those updates sites which were once recommended can end up seeing the recommendation go away (especially if user experience declined since the initial recommendation via an ad heavy layout or similar) while other sites that have good engagement metrics get recommended on related searches.

A friend had a website they had sort of forgotten about that was recommended by Apple. That site saw a big jump on July 9, 2018, then slid back in early August that year, likely after testing data showed it wasn’t as good as some other site Apple recommended. They noticed the spike in traffic & improved the site a bit. In early October it was widely recommended once again. That lasted until May of 2019, when it fell off a cliff once more. They had monetized the site with a somewhat spammy ad network & the recommendation mostly went away.

The recommendations happen as the person types, and they may differ between searches where there is a space between keywords and searches where the words are run together. It is also worth noting that Apple will typically recommend the www. version of a site over the m. version for sites that offer both, so if you use separate URLs it makes sense to ensure the www version also uses a responsive website design.

Indirect Impact on Google

While the Apple search shortcuts bypass Google search & thus do not create direct user signals to impact Google search, people who own an iPhone then search on a Windows computer at work or a Windows laptop at home might remember the site they liked from their iPhone and search for it once more, giving the site some awareness that could indirectly bleed over into impacting Google’s search rankings.

Apple could also eventually roll out their own fully featured search engine.



Crawl Budget Optimisation Through Log File Analysis

Log file analysis is one of those tasks you might not do often – due to data availability & time constraints – but it can provide insights you wouldn’t be able to discover otherwise, particularly for large sites. If you’ve never done a log analysis, or are unsure what exactly to look for and where to start, I’ve built a guideline to help you:

Get started with some log file analysis tools
Understand what log files are useful for
Dig into the data and think about how to better redistribute crawling resources

Log files are essentially a diary of all requests made to your site over a specific time period. The data is very specific and more in-depth than anything you could gather from a crawl, Google Analytics and Google Search Console combined. By analysing this data you can quantify the size of any potential issue you discover and make better decisions on what to dig into further. You can also discover issues, such as odd crawler behavior, that you could not identify through a regular tech audit. Log analysis is particularly valuable for large sites, where a crawl would require an extensive amount of time and resources.
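
As a rough illustration of what each log entry contains, here is a minimal sketch of parsing a line in the common Apache/Nginx “combined” format (the field names and sample line are illustrative; adjust the pattern to your server’s configuration):

```python
import re

# Minimal sketch: parse one line of Apache/Nginx "combined" log format.
# Adjust the pattern if your server logs a different format.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<size>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

sample = ('66.249.66.1 - - [12/Aug/2020:10:15:32 +0000] '
          '"GET /category/widgets?page=2 HTTP/1.1" 200 5120 '
          '"-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')
entry = parse_line(sample)
print(entry["path"], entry["status"])
```

Once every line is parsed this way, filtering to a specific crawler is just a matter of checking the agent field (ideally combined with reverse-DNS verification, since user agents can be spoofed).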

Log file analysis tools

There are different tools out there for this task: Screaming Frog, Botify and BigQuery, to mention a few. At Distilled, we use BigQuery, which is quite flexible. A great place to get started if you’re not familiar with log analysis is the guideline Dom Woodman, senior consultant at Distilled, wrote on what a log file analysis is and how to do one.

Regardless of the tool you choose to use, you should be able to use the framework below.

Understand what log files are useful for

Log files are a really good source for:

Discovering potential problems: use them to find things you can’t find with a crawl, since a crawl doesn’t include Google’s historical memory of your site.
Identifying what to prioritise: knowing how often Google visits URLs can be a useful way of prioritising things.

The best part about log files is that they include all kinds of information you might want to know about, and more. Page response code? They have it. Page file type? Included. Crawler type? Should be in there. You get the idea. But until you slice your data in meaningful ways you won’t know what all this information is useful for. 

Digging into the data

When you begin to analyse logs, you should slice the information into big chunks to obtain a good overall picture of the data, because it helps you understand what to prioritise. You should always compare results to the number of organic sessions obtained, because that helps establish whether crawling budget should be distributed differently.

These are the criteria I use to dig into the log file:

Top 10 URLs/paths most requested
200 vs. non-200 response codes
URLs with parameters vs. URLs without parameters
File type requests
Requests per subdomain
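
Most of these slices are simple aggregations once the logs are parsed. Here is a minimal sketch of the first few, assuming each log entry has already been parsed into a dict with path and status fields (the sample entries are hypothetical):

```python
from collections import Counter

# Sketch: slicing parsed log entries into the breakdowns above.
# Assumes each entry is a dict with "path" and "status" fields; data is hypothetical.
entries = [
    {"path": "/", "status": "200"},
    {"path": "/category/widgets", "status": "200"},
    {"path": "/old-page", "status": "301"},
    {"path": "/category/widgets", "status": "200"},
    {"path": "/missing", "status": "404"},
]

total = len(entries)

# Top requested paths
top_paths = Counter(e["path"] for e in entries).most_common(10)

# 200 vs. non-200 responses, as a share of all requests (handy for thresholding)
non_200 = sum(1 for e in entries if e["status"] != "200")
non_200_pct = 100 * non_200 / total

# URLs with parameters vs. without
with_params = sum(1 for e in entries if "?" in e["path"])

print(top_paths[0], f"{non_200_pct:.0f}% non-200", f"{with_params} parameterised")
```

Each slice reduces to a count over parsed entries, which you can then compare against the significance threshold you decide on below.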

Before you begin

At this stage, you should also decide on a threshold for what represents a significant percentage of your data. For example, if you discover that there are 20,000 requests with a 301 response code and the total number of requests in the logs is 2,000,000, then knowing that the 301s are only 1% of total requests helps you bucket this as a low-priority issue. This might change by page type: for example, 10% of category pages with a 404 status code might be more important than 10% of product pages with a 404 code.

Once you begin obtaining results from your data, you should consider whether the current crawler behavior is the best use of crawling resources. The answer to this question will tell you what the following actions should be.

Top 10 URLs/paths most requested vs organic sessions they drive

Through log file analysis you’ll often discover a few paths or specific URLs that receive a significantly higher number of requests than the rest. These usually happen to be URLs linked from most templates, for example from the main nav or footer, or from external sources, yet they often don’t drive a high number of organic sessions.

Depending on what type of URLs these are, you may or may not need to take action. For example, if 40% of resources are used to request a specific URL, is that the best use of crawling resources or could they be better distributed?

Below is an example of the breakdown of top requested paths from log analysis and how they compare to organic sessions they drive:

Making this comparison on a graph allows you to easily identify how crawling resources could be better distributed. The first two blue bars show that the majority of requests are to two specific paths which drive no organic sessions. This is a quick way to identify important wins right away: in the example above, the next step would be to understand what those URLs are and where they are found to then decide whether they should be crawled and indexed or what additional action may be required. A tech audit would not give you the information I show in the graph. 
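
A comparison like the one in the graph can be sketched in a few lines. All the numbers below are hypothetical; in practice the request counts come from your logs and the session counts from your analytics export:

```python
# Sketch: comparing crawl requests per path with the organic sessions they drive.
# All paths and counts below are hypothetical.
requests_by_path = {"/feed/": 40000, "/tag/": 25000, "/category/": 12000, "/products/": 8000}
sessions_by_path = {"/feed/": 0, "/tag/": 10, "/category/": 9000, "/products/": 15000}

total_requests = sum(requests_by_path.values())
flagged = []
for path, reqs in sorted(requests_by_path.items(), key=lambda kv: -kv[1]):
    share = 100 * reqs / total_requests
    sessions = sessions_by_path.get(path, 0)
    # A large share of crawl requests with few sessions is a candidate for action
    if share > 20 and sessions < 100:
        flagged.append(path)
    print(f"{path}: {share:.0f}% of requests, {sessions} organic sessions")

print("Candidates for redistributing crawl budget:", flagged)
```

The exact 20%-of-requests / 100-sessions cutoffs are placeholders; set them from the significance threshold you chose earlier.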

Page response code

If a high percentage of the log requests is to non-200 code pages, you may want to dig into this area further. Here you should query your data to discover the breakdown of non-200 response codes and, based on the results, dig further, prioritising those with the highest percentage.

Below is an example of non-200 code pages breakdown:

As visible above, almost 50% of all requests are to a non-200 status code page. In this case, investigate each status code further to discover which types of pages they come from and what percentage each represents. As a side note, if you also encounter a large number of pages with a 304 status code, this is a server response essentially equivalent to a 200: the 304 indicates that the page has not changed since the previous request.

Here are some common checks you should do on non-200 code pages:

Are there patterns of internal links pointing to these pages? A crawl of the site would be able to answer this.
Is there a high number of external links/domains pointing to these pages?
Are any of these pages’ status code caused by certain actions/situations? (i.e. on ecommerce sites, discontinued products may become 404 pages or 301 redirects to main categories) 
Does the number of pages with a specific status code change over time?

URLs with parameters vs non-parameters

URLs with parameters can cause page duplication; very often they are just a copy of the page without parameters, creating a large number of URLs that add no value to the site. In an ideal world, no URLs discovered by crawlers would include parameters. However, this is not usually the case, and a good amount of crawling resources is used to crawl parameterised URLs. You should always check what percentage of total requests parameterised URLs make up.

Once you know the size of the issue, here are a few things to consider:

What is the page response code of these URLs?
How are parameterised URLs being discovered by crawlers?
Are there internal links to parameterised URLs?
Which parameter keys are found most often, and what is their purpose?
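
Counting which parameter keys crawlers request most often can be sketched like this (the URLs are hypothetical examples of what might appear in your logs):

```python
from collections import Counter
from urllib.parse import urlsplit, parse_qsl

# Sketch: how often crawlers request parameterised URLs, and which keys dominate.
# The requested URLs below are hypothetical.
requested_urls = [
    "/shoes?color=red&size=9",
    "/shoes?color=blue",
    "/shoes?sort=price",
    "/shirts?color=green&sessionid=abc123",
    "/shirts",
]

param_keys = Counter()
parameterised = 0
for url in requested_urls:
    query = urlsplit(url).query
    if query:
        parameterised += 1
        param_keys.update(key for key, _ in parse_qsl(query))

print(f"{100 * parameterised / len(requested_urls):.0f}% of requests are parameterised")
print(param_keys.most_common())
```

Keys like a session ID that generate unique URLs per visitor are usually the first candidates to investigate.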

Depending on what you discover in this phase, there may be actions related to previous steps that apply here. 

File type requests

I always check the file type breakdown to quickly discover whether requests to resources such as images or JavaScript files make up a big portion. In an ideal scenario, the highest percentage of requests should be for HTML pages, because these are the pages Google not only understands but also the pages you want to rank well. If you discover that crawlers are spending considerable resources on non-HTML files, then this is an area to dig into further.
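
A rough sketch of that file type breakdown, inferring the type from the URL’s extension (the paths are hypothetical, and extensionless URLs are assumed to serve HTML):

```python
from collections import Counter
import posixpath

# Sketch: breaking down requests by file type. Paths are hypothetical examples.
requested_paths = [
    "/index.html", "/style.css", "/app.js", "/logo.png",
    "/about", "/products/widget", "/app.js", "/hero.jpg",
]

def file_type(path):
    ext = posixpath.splitext(path)[1].lstrip(".").lower()
    return ext or "html"  # assumption: extensionless URLs serve HTML

breakdown = Counter(file_type(p) for p in requested_paths)
total = len(requested_paths)
for ext, count in breakdown.most_common():
    print(f"{ext}: {100 * count / total:.0f}%")
```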

Here are a few important things to investigate:

Where are the resources discovered/linked from?
Do they need to be crawled or should they just be used to load the content?

As usual, you should bear in mind the most important question: is this the best use of crawling resources? If not, then consider blocking crawlers from accessing these resources. This can easily be done by blocking them in robots.txt; however, before you do, you should always check with your developers.
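
For illustration only, a robots.txt fragment blocking a hypothetical internal search directory and a sort parameter might look like the following. The paths are placeholders, and you should confirm with your developers that nothing blocked here (especially CSS or JavaScript) is needed to render your pages:

```
# Hypothetical example - the paths below are placeholders.
# Do not block CSS/JS that Google needs to render your pages.
User-agent: *
Disallow: /internal-search/
Disallow: /*?sort=
```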

Requests per subdomain

You may not need this step if you don’t have any subdomains, but otherwise this is a check you should do to discover unusual behavior. In particular, if you are analysing the logs of a specific domain, requests to other subdomains should be somewhat limited, depending on how your internal linking is organised. It also depends on whether Google sees the subdomains as part of your site rather than as separate sites.

As with the previous steps, this is a first breakdown of your data, and the results should tell you whether anything is worth digging into further.

A few things to keep in mind in this section:

Should crawlers spend less/more time on subdomains?
Where are the subdomain pages discovered within your site?

This could be another opportunity for redistributing crawling budget to the pages you want crawlers to discover.

To wrap it up

As with many SEO tasks, there are many different ways to go about a log analysis. The guideline I shared is meant to provide you with an organised method that helps you think about crawling budget resources and how to better use them. If you have any advice on how you think about crawling budget resources, please leave your advice in a comment below. 


How to duplicate a post in WordPress, plus 4 reasons why!

If you write content in WordPress, duplicating a post can come in quite handy. It can save you a lot of valuable time to clone a post and adjust the content, instead of starting from scratch with every post you write. Fortunately, cloning a post becomes very easy with the Yoast Duplicate post plugin. In this article, you can read how to use it and we’ll discuss 4 everyday situations in which you might want to use it.

How to duplicate a post in WordPress

One of the newest additions to our Yoast stable is the Yoast Duplicate Post plugin. This simple but effective plugin helps you duplicate or clone a post in a few simple steps:

Install the Yoast Duplicate Post plugin

If you don’t have the plugin yet, simply go to Plugins in the backend of your WordPress site and install Yoast Duplicate Post. Not sure how to do this? This article explains how to install a plugin.

Click on Posts

After you’ve installed and activated the Yoast Duplicate Post plugin, you’re good to go. When you want to duplicate a post, go to your post overview where you’ll see all your posts listed. Find the post you want to clone:

Hover over the post you’d like to clone

If you hover your mouse over the post you’d like to clone, you’ll see some options appear under the post title:

Click on Clone post

When you want to duplicate your post, simply click “Clone” or “New Draft”. Both functions will clone your post. If you click on “New Draft” the clone will open directly so you can start working in it immediately. If you click “Clone” a duplicate of your post will appear as a draft post in the list:

Rename the clone

To prevent confusion, it’s best to rename your duplicate post right away. You can do this by clicking on the post and editing the title there. Or you can click on “Quick Edit” in the post overview and edit the title in this input field:

4 reasons to duplicate or clone a post

There are several reasons why you’d want to create a clone of an existing post. There might be more than 4 reasons, but here we’d like to highlight the ones most of us will recognize. Of course, you don’t want to publish the exact same or very similar content, as that might confuse search engines. So, in what situations should you use it?

1. Extensive updates on existing posts and pages

Keeping your content fresh and up to date is a sensible thing to do. You don’t want to show visitors outdated or incorrect information. Also, search engines prefer to serve users content that is regularly updated and accurate. Sometimes, updating is just a matter of changing a sentence here and there or fixing a typo, which you can easily do in an existing post. But if it needs more work, for instance, a complete rewrite of multiple paragraphs, you might want to work on this in a clone.

Working in a clone has a couple of advantages:

it allows you to adjust what’s needed, save it, re-read it, and correct it if necessary before your changes go live;
you can preview your post and see exactly what it looks like;
you can share the preview with others before you publish the changes.

When you’re sure the post is ready for publication, copy the content of the clone into the existing post and hit update. That way you’ll keep the old URL. If you do want to publish the clone instead, make sure to delete the old post and create a redirect!

2. Scheduled updates

In some cases, you don’t want to publish changes right away. You’ll have to wait until, for instance, a product is launched or an event has taken place. If you have a cloned post to work in, you can prepare the changes in advance and just copy the content over or push the post live (don’t forget to redirect!) when the time comes. This will save you a lot of last-minute work and editing.

3. Merges of multiple posts

Large sites often have lots of content. Inevitably, the content you publish might become more alike over time. We notice this ourselves as we write a lot about content optimization. Before you know it, you’ll have multiple posts on how to optimize a blog post. Is this a bad thing? Well, it might be, if you start competing with yourself in the search engines. We call this keyword cannibalization. We have a complete article on how to find and fix cannibalization in a smart way.

If you have posts that are very similar and compete for a top ranking in the search results, you’re better off merging them into one complete and high-quality post. In order to do so, you can check how these similar posts are doing, which one gets the most traffic and ranks highest. This is the preferred post or URL to keep. 

When you take a closer look at the other post you might find interesting stuff in there that your high-performing post is missing. Then, of course, add it! This might be quite a puzzle though, and that’s where duplicating your post comes in. If you create a clone, you can take a good look at both posts, take the best out of both of them and merge them into one awesome and complete post. When you’re done, copy the content from the clone into the best-performing URL and don’t forget to redirect the post you’re not keeping! 

4. Reusing a format

Especially in eCommerce, you might have a certain format for a product page. But also, for a series of posts on your blog, help pages on your site, or events, you might like to stick to a certain format. If you’re using a format you’re happy with, you can use the clone function to duplicate the page with the right format. Delete the content you shouldn’t keep and just fill the post with the content about other products, help info, or events. It’s as easy as that, and a huge time saver.

What do you use it for?

Do you already use the Yoast Duplicate Post plugin? We’d like to know what situations you use it in! As we’re continuously improving the plugin, we love to hear how you use it and what features could be useful to add or improve. So please share your thoughts here!

Read more: about the Yoast Duplicate Post plugin »

The post How to duplicate a post in WordPress, plus 4 reasons why! appeared first on Yoast.


Update to Aggregate Lighthouse Reporter

Hey everyone!

It’s been a hot minute since I posted so just wanted to quickly come in here and share an update we made to a free tool.

A while back we released a tool that would allow you to aggregate Lighthouse reports by template and visualize and report on assets across sites and at the template/page level in Google Data Studio. Sounds pretty cool right? You can read about it here and check out the GitHub repo here.

Shortly after that Google released their Core Web Vitals and included them in a release of Lighthouse. So here we are. We have updated our repo to include Lighthouse 6.0 (this is a copy of our production repo so it will automatically update.)

That means you can get all these beautiful visualizations of Core Web Vitals and a few other new things:


You can check out an example report to play around with here (sorry, not sorry Waste Management). And in case you missed it the first time, the GitHub repo is at the big button below.

Check it out on GitHub

The post Update to Aggregate Lighthouse Reporter appeared first on Local SEO Guide.


How to Increase Website Traffic by 1,070% (SEO Case Study)

Today I’m going to show you how to increase website traffic by 1,070%. This growth is from a B2C niche website we launched in 2018. Let’s dive right in. 1. Ignore Search Volume (Sometimes) There is an untapped and uncompetitive world for keywords with no search volume. You have to remember that search volume is …



Let’s Make Money: 4 Tactics for Agencies Looking to Succeed – Best of Whiteboard Friday

Posted by rjonesx.

We spend a lot of time discussing SEO tactics, but in a constantly changing industry and especially in times of uncertainty, the strategies agencies should employ in order to see success deserve more attention. In this popular (and still relevant) Whiteboard Friday, Russ Jones discusses four essential success tactics that’ll ultimately increase your bottom line. 

Russ also delved into the topic of profitability in his MozCon Virtual presentation this year. To watch his and our other amazing speaker presentations, you can purchase access to the 2020 video bundle here.  

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans. I am Russ Jones, and I can’t tell you how excited I am for my first Whiteboard Friday. I am Principal Search Scientist here at Moz. But before coming to Moz, for the 10 years prior to that, I was the Chief Technology Officer of a small SEO agency back in North Carolina. So I have a strong passion for agencies and consultants who are on the ground doing the work, helping websites rank better and helping build businesses.

So what I wanted to do today was spend a little bit of time talking about the lessons that I learned at an agency that admittedly I only learned through trial and error. But before we even go further, I just wanted to thank the folks at Hive Digital who I learned so much from, Jeff and Jake and Malcolm and Ryan, because the team effort over time is what ended up building an agency. Any agency that succeeds knows that that’s part of it. So we’ll start with that thank-you.

But what I really want to get into is that we spend a lot of time talking about SEO tactics, but not really about how to succeed in an industry that changes rapidly, in which there’s almost no certification, and where it can be difficult to explain to customers exactly how they’re going to be successful with what you offer. So what I’m going to do is break down four really important rules that I learned over the course of that 10 years. We’re going to go through each one of them as quickly as possible, but at the same time, hopefully you’ll walk away with some good ideas. Some of these are ones that it might at first feel a little bit awkward, but just follow me.

1. Raise prices

The first rule, number one in Let’s Make Money, is raise your prices. Now, I remember quite clearly, two years into my job at Hive Digital — it was called Virante then — we were talking about raising prices. We were just looking at our customers, saying to ourselves, “There’s no way they can afford it.” But then luckily we had the foresight to realize there was more to raising prices than just charging your customers more.

How it benefits old customers

The first thing that just hit us automatically was… “Well, with our old customers, we can just discount them. It’s not that bad. We’re in the same place as we always were.” But then it occurred to us, “Wait, wait, wait. If we discount our customers, then we’re actually increasing our perceived value.” Our existing customers now think, “Hey, they’re actually selling something better that’s more expensive, but I’m getting a deal,” and by offering them that deal because of their loyalty, you engender more loyalty. So it can actually be good for old customers.

How it benefits new customers

Now, for new customers, once again, same sort of situation. You’ve increased the perceived value. So your customers who come to you think, “Oh, this company is professional. This company is willing to invest. This company is interested in providing the highest quality of services.” In reality, because you’ve raised prices, you can. You can spend more time and money on each customer and actually do a better job. The third part is, “What’s the worst that could happen?” If they say no, you offer them the discount. You’re back where you started. You’re in the same position that you were before.

How it benefits your workers

Now, here’s where it really matters — your employees, your workers. If you are offering bottom line prices, you can’t offer them raises, you can’t offer them training, you can’t hire them help, or you can’t get better workers. But if you do, if you raise prices, the whole ecosystem that is your agency will do better.

How it improves your resources

Finally, and most importantly, which we’ll talk a little bit more later, is that you can finally tool up. You can get the resources and capital that you need to actually succeed. I drew this kind of out.

If we have a graph of quality of services that you offer and the price that you sell at, most agencies think that they’re offering great quality at a little price, but the reality is you’re probably down here. You’re probably under-selling your services and, because of that, you can’t offer the best that you can.

You should be up here. You should be offering higher quality, your experts who spend time all day studying this, and raising prices allows you to do that.

2. Schedule

Now, raising prices is only part one. The second thing is discipline, and I am really horrible about this. The reality is that I’m the kind of guy who looks for the latest and greatest and just jumps into it, but schedule matters. As hard as it is to admit it, I learned this from the CPC folks because they know that they have to stay on top of it every day of the week.

Well, here’s something that we kind of came up with as I was leaving the company, and that was to set all of our customers as much as possible into a schedule.

Annually: we would handle keywords and competitors, doing complete analysis.
Semi-annually: twice a year, we would do content analysis. What should you be writing about? What’s changed in your industry? What are different keywords that you might be able to target now given additional resources?
Quarterly: you need to be looking at links. It’s just a big enough issue that you’ve got to look at it every couple of months, a complete link analysis.
Monthly: you should be looking at your crawls. Moz will do that every week for you, but you should give your customers an idea, over the course of a month, of what’s changed.
Weekly: you should be doing rankings.

But there are three things that, when you do all of these types of analysis, you need to keep in mind. Each one of them is a…

Report
Hours for consulting
Phone call

This might seem like a little bit of overkill. But of course, if one of these comes back and nothing changed, you don’t need to do the phone call, but each one of these represents additional money in your pocket and importantly better service for your customers.

It might seem hard to believe that when you go to a customer and you tell them, “Look, nothing’s changed,” that you’re actually giving them value, but the truth is that if you go to the dentist and he tells you, you don’t have a cavity, that’s good news. You shouldn’t say to yourself at the end of the day, “Why’d I go to the dentist in the first place?” You should say, “I’m so glad I went to the dentist.” By that same positive outlook, you should be selling to your customers over and over and over again, hoping to give them the clarity they need to succeed.

3. Tool up!

So number three, you’re going to see this a lot in my videos because I just love SEO tools, but you’ve got to tool up. Once you’ve raised prices and you’re making more money with your customers, you actually can. Tools are superpowers. Tools allow you to do things that humans just can’t do. Like I can’t figure out the link graph on my own. I need tools to do it. But tools can do so much more than just auditing existing clients. For example, they can give you…

Better leads:

You can use tools to find opportunities. Take the tools within Moz, for example: say you want to find other car dealerships in the area that are really good and have an opportunity to rank, but aren't doing as well as they should be in the SERPs. You want to do this because you've already successfully serviced a different car dealership. Well, tools like Moz can do that. You don't just have to use Moz to help your clients. You can use it to help yourself.

Better pre-audits:

Nobody should walk into a sales call blind. You know what the website is, so you can start with a great pre-audit.

Faster workflows:

Which means you make more money quicker. If you can do your keyword analysis annually in half the time because you have the right tool for it, then you’re going to make far more money and be able to serve more customers.

Bulk pricing:

This one is just mind-blowingly simple: bulk pricing. With every tool out there, the more you buy, the lower the price is. I remember sitting down at one point at my old company and recognizing that every customer who came in the door would need to spend about $1,000 on individual accounts to match what they were getting through us, because we could take advantage of the bulk discounts we got as an agency by buying these seats on behalf of all of our customers.

So tell your clients when you’re talking to them on the phone, in the pitch be like, “Look, we use Moz, Majestic, Ahrefs, SEMrush,” list off all of the competitors. “We do Screaming Frog.” Just name them all and say, “If you wanted to go out and just get the data yourself from these tools, it would cost you more than we’re actually charging you.” The tools can sell themselves. You are saving them money.
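The bulk-pricing pitch above is simple arithmetic. Here is a minimal sketch of it in Python; every price and the discount rate are made-up placeholders for illustration, not actual quotes from any of these vendors:

```python
# Hypothetical monthly prices a single client would pay buying tools directly.
# The tool names are real; the numbers are placeholders, not real pricing.
individual_monthly = {
    "Moz": 99,
    "Ahrefs": 99,
    "Semrush": 120,
    "Majestic": 50,
    "Screaming Frog": 20,
}

# What one client pays on their own for the full stack.
client_pays_alone = sum(individual_monthly.values())

# Assume the agency's bulk seats earn a 40% discount (also a placeholder).
agency_discount = 0.40
agency_cost_per_client = client_pays_alone * (1 - agency_discount)

savings = client_pays_alone - agency_cost_per_client
print(f"Solo tool stack: ${client_pays_alone}/mo")
print(f"Effective cost inside the agency retainer: ${agency_cost_per_client:.2f}/mo")
print(f"Savings per client: ${savings:.2f}/mo")
```

With these placeholder numbers, the client's solo stack costs $388 a month while the agency's bulk rate works out to $232.80, which is the gap you can point to in the pitch.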

4. Just say NO

Now, the last section, real quickly, are the things you’ve just got to learn to say no to. One of them has a little nuance to it. There’s going to be some bite back in the comments, I’m pretty sure, but I want to be careful with it.

No month-to-month contracts

The first thing to say no to is month-to-month contracts.

If a customer comes to you and says, "Look, we want to do SEO, but we want to be able to cancel every 30 days," the reality is this: they're not interested in investing in SEO. They're interested in dabbling in SEO. They're interested in experimenting with SEO. Well, that's not going to succeed. It's only going to take one or two competitors who actually invest in it to beat them out, and when they get beaten, you're going to look bad and they're going to cancel their account with you. So sit down with them and explain that SEO is a long-term strategy, and that it's just not worth it to your company to bring on customers who aren't interested in investing in it. Say it politely, but turn it away.

Don’t turn anything away

Now, notice that my next thing is don’t turn anything away. So here’s something careful. Here’s the nuance. It’s really important to learn to fire clients who are bad for your business, where you’re losing money on them or they’re just impolite, but that doesn’t mean you have to turn them away. You just need to turn them in the right direction. That right direction might be tools themselves. You can say, “Look, you don’t really need our consulting hours. You should go use these tools.” Or you can turn them to other fledgling businesses, friends you have in the industry who might be struggling at this time.

I’ll tell you a quick example. We don’t have much time, but many, many years ago, we had a client that came to us. At our old company, we had a couple of rules about who we would work with. We chose not to work in the adult industry. But at the time, I had a friend in the industry. He lived outside of the United States, and he had fallen on hard times. He literally had his business taken away from him via a series of just really unscrupulous events. I picked up the phone and gave him a call. I didn’t turn away the customer. I turned them over to this individual.

That very next year, he had ended up landing a new job at the top of one of the largest gambling organizations in the world. Well, frankly, they weren’t on our list of people we couldn’t work with. We landed the largest contract in the history of our company at that time, and it set our company straight for an entire year. It was just because instead of turning away the client, we turned them to a different direction. So you’ve got to say no to turning away everybody. They are opportunities. They might not be your opportunity, but they’re someone’s.

No service creep

The last one is service creep. Oh, man, this one is hard. A customer comes up to you and they list off three things that you offer that they want, and then they say, “Oh, yeah, we need social media management.” Somebody else comes up to you, three things you want to offer, and they say, “Oh yeah, we need you to write content,” and that’s not something you do. You’ve just got to not do that. You’ve got to learn to shave off services that you can’t offer. Instead, turn them over to people who can do them and do them very well.

What you’re going to end up doing in your conversation, your sales pitch is, “Look, I’m going to be honest with you. We are great at some things, but this isn’t our cup of tea. We know someone who’s really great at it.” That honesty, that candidness is just going to give them such a better relationship with you, and it’s going to build a stronger relationship with those other specialty companies who are going to send business your way. So it’s really important to learn to say no to say no service creep.

Well, anyway, there’s a lot that we went over there. I hope it wasn’t too much too fast, but hopefully we can talk more about it in the comments. I look forward to seeing you there. Thanks.

Video transcription by Speechpad.com

Ready for more?

You’ll uncover even more SEO goodness in the MozCon 2020 video bundle. At this year’s special low price of $129, this is invaluable content you can access again and again throughout the year to inspire and ignite your SEO strategy:

- 21 full-length videos from some of the brightest minds in digital marketing
- Instant downloads and streaming to your computer, tablet, or mobile device
- Downloadable slide decks for presentations

Get my MozCon 2020 video bundle

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!
