Using the SERP to build your keyword list

Posted by TheMozTeam on Apr 30, 2019 in SEO Articles

This post was originally published on the STAT blog.

Keyword lists keeping you up at night? We feel you — and so does every other SEO. There’s a lot that goes into producing a robust keyword list and having one can make the difference between seeing the whole SERP landscape or getting just a glimpse.

Because we care about how much sleep you’re getting (a healthy eight hours, please), we whipped up a useful guide on our favourite way to keyword list-build, and all you need are three SERP features: the “People also ask” box, related searches, and the “People also search for” box.

We’ll explain why you should give these features a test drive and how you can get your hands on all their Google-vetted queries for the ultimate, competition-crushing keyword list.

Watch us turn 3,413 Nikon-related keywords into 25,349 without lifting a pinky finger.

Google-approved search terms 

Each of these features is a keyword goldmine — all three link to new SERPs from terms that are semantically related to the searcher’s original query. As a result, they provide excellent insight into how users follow up, narrow down, or refine their searches, and they reveal relevant topics that might otherwise be overlooked.

Google has put a lot of effort (and dollars) into understanding and mapping how topics and queries are linked, and these SERP features are the direct result of all that research — Google is literally pointing you to how and what everyone is searching. Which is why we dig them so much.

The “People Also Ask”

You’re probably quite familiar with this accordion-like feature. The “People also ask” box contains questions related to the searcher’s initial query, which then expand to reveal answers that Google has pulled from other websites.

Not only are PAA questions excellent long-tail additions to your keyword set, they’re also a great resource for content inspiration. The various ways that they express the same basic question can help you expand on topics — one piece of content could easily answer PAA questions such as “What a photographer needs to get started?” and “What tools do I need to be a photographer?”

Just try not to fall down the query rabbit hole. While the PAA box used to surface anywhere from one to four Q&A combos, most are “infinite” now and can easily multiply into the hundreds — giving you a seemingly endless supply of SERPs to track.

Just where are all these questions coming from, though? Are people actually asking them? If you read our previous write-up on the PAA, you’ll know that Google is not always selecting these questions based on actual searched queries — some return zero search volume when tracked.

If that wasn’t enough to raise our eyebrows, errant capitalization or non-capitalization (“how many mm are there in one Metre?”), wonky grammar (“Is aperture and f stop the same thing?”), and odd follow-up question choices (“how do you take a selfie?” for the query [easy to use digital camera]) suggest that many PAA questions are the result of machine learning.

In other words, Google is doing its darndest to understand actual search queries and spit out relevant subsequent searches to save users the effort. And it makes sense for us to be on those SERPs when searchers decide to take them up on the offer.

In order to capture all the goodies hiding in a PAA, we created a handy report. For each of your keywords that return a PAA box, our .CSV report will list the questions “also asked” (don’t worry, you’ll only get the number of PAAs that exist before things get infinitely overwhelming) and the URLs that Google sourced the answers from, plus the order they appear in.

After we ran the report for our Nikon queries, we found ourselves looking at 2,838 potential new keywords. A quick scan revealed that many of our PAA boxes returned the same questions over and over again (65.57 percent were duplicates), so we set about removing those. This narrowed our PAA keyword list down to 977 topically related queries to explore.
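If you’re comfortable with a little scripting, that duplicate-trimming step is easy to automate. Here’s a minimal Python sketch that collapses repeated queries from a report export; note that the `keyword` column header is a made-up placeholder, so match it to whatever header your CSV actually uses.

```python
import csv
from io import StringIO

def dedupe_keywords(rows):
    """Collapse repeated queries, ignoring case and stray whitespace,
    while preserving the order in which they first appeared."""
    seen = set()
    unique = []
    for kw in rows:
        key = kw.strip().lower()
        if key and key not in seen:
            seen.add(key)
            unique.append(kw.strip())
    return unique

# Toy stand-in for the report's question column; real exports are far larger.
sample_csv = StringIO(
    "keyword\n"
    "what camera do professional photographers use\n"
    "What camera do professional photographers use\n"
    "is nikon better than canon\n"
)
keywords = [row["keyword"] for row in csv.DictReader(sample_csv)]
print(dedupe_keywords(keywords))
```

The case-folding matters: PAA questions often repeat with only capitalization changes, and you don’t want those counted twice.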

Related searches 

Another go-to for keyword inspiration is the set of eight related searches found at the bottom of the SERP, each of which, when clicked, becomes the search query of a new SERP.

For instance, if we’re interested in ranking for “best professional cameras,” a quick look at the related searches will reveal alternative SERPs that Google thinks our searchers may be interested in, like “best professional camera for beginner,” “best dslr camera,” and “best point and shoot camera.” They help us understand how our searcher may refine or expand upon their original query.

Our related searches report makes it so that you don’t have to manually gather the “Searches related to” yourself — it takes them all and combines them into a crisp and clean .CSV spreadsheet.

This report surfaced 12,526 keywords for Nikon, and just like with our PAA suggestions, we noticed a bunch of repeat related search offenders. After trimming out the duplicates (55.09 percent), we were left with 5,626 unique keywords to help us flesh out our Nikon project.

The “People Also Search for” box 

While the term “People also search for” (PASF) isn’t new to the SERP, the feature did get a major refresh back in February, which levelled things up.

Now, instead of just being attached to a knowledge graph, the PASF box also attaches itself to organic URLs and contains extra queries (up to eight on desktop; six on mobile) related to the URL that surfaces it. It’s Google’s way of saying, “Didn’t find what you’re looking for? We’ve got you — try these instead.”

This SERP feature requires you to do a little pogo-sticking in order to surface it — you need to click on the organic search result and then navigate back to the SERP before it materializes.

Obviously collecting these terms would involve a lot of work and potential finger cramps. Thankfully, there’s a handy hack to bypass all that, which is great if pogo-sticking isn’t your cup of tea. This lovely bit of JavaScript code originated from Carlos Canterello and reveals all the PASF boxes on a SERP without all the back and forth-ing.

Or, for those of you feeling DIY-y, you can pull all the raw HTML SERPs and parse them yourself — sans pogo stick, sans hack. Since we’re card-carrying data nerds, we opted for this route — we pulled the raw HTML SERPs through the STAT API and had ourselves a parsing party.
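If you go the raw-HTML route, a parser sketch might look like the following. This is only an illustration using Python’s standard library: the `pasf-item` class name is entirely hypothetical, since Google’s real markup is obfuscated and changes frequently, so you’d inspect a saved SERP and swap in whatever selector it actually uses (void tags like `<br>` would also need handling in production).

```python
from html.parser import HTMLParser

class PASFExtractor(HTMLParser):
    """Collect the text of elements carrying a PASF marker class.
    The class name below is a made-up placeholder."""
    PASF_CLASS = "pasf-item"  # hypothetical selector

    def __init__(self):
        super().__init__()
        self.terms = []
        self._depth = 0  # >0 while inside a PASF element

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if self.PASF_CLASS in classes:
            self._depth += 1
            self.terms.append("")
        elif self._depth:
            self._depth += 1  # nested tag inside a PASF element

    def handle_endtag(self, tag):
        if self._depth:
            self._depth -= 1

    def handle_data(self, data):
        if self._depth and data.strip():
            self.terms[-1] += data.strip()

sample_html = (
    '<div><span class="pasf-item">nikon d850 review</span>'
    '<span class="pasf-item">best nikon lens</span></div>'
)
parser = PASFExtractor()
parser.feed(sample_html)
print(parser.terms)  # ['nikon d850 review', 'best nikon lens']
```

Run this over each saved SERP, pool the results, and you’ve got your PASF candidate list without a single pogo-stick click.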

With upwards of eight PASF terms per organic result per SERP, we had oodles of keyword ideas on hand — a grand total of 59,284 to be exact (woah). Once we took away the duplicates, we were left with 18,746 unique keywords. That’s quite a drop from our original number — a whopping 68.38 percent of our keywords were repeats.

Keyword evaluation

Once our reports finished generating and we’d removed all those duplicates, we had 25,349 brand new keywords from all three features — that’s 642.71 percent more than what we started with.
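For the curious, that growth figure is just a simple percent-increase calculation:

```python
def percent_increase(before, after):
    """Growth of the keyword list, expressed as a percentage of the original."""
    return (after - before) / before * 100

start, end = 3413, 25349  # figures from this walkthrough
print(f"{percent_increase(start, end):.1f}% more keywords")
```

This prints 642.7 percent, which squares with the figure above up to rounding.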

While we trust Google to offer up excellent suggestions, we want to be sure we’ve got only the most relevant keywords to our project. To do this, we conducted a little keyword audit.

First, we combined all our queries into a master list and did some work to surface what was useful and remove the ones that, straight up, made zero sense, such as: “Russian ammo website,” “wallmart,” and “how to look beautiful in friends marriage,” which is super specific and very odd, but we applaud the level of dedication.

This removed 2,238 keywords from the mix, leaving us with a grand total of 23,111 keywords to creep on.

Satisfied with our brand spanking new list, we loaded those puppies into STAT to follow them around for a couple of days for further vetting.

Since we like it when things are Monica-level organized (and because smart segmentation will be key to making sense of all 23,111 of our keywords), we bagged and tagged our new queries into groups of the SERP features from whence they came so we can track which makes the best suggestions.

With our data hyper-organized, and with our search volume populated, we then selected keywords that returned no search volume and kicked them to the curb. You should do this too if you want to minimize clutter and focus on queries that will drive traffic.

We also decided to remove keywords with a search volume of less than 100. Just remember, though: search volume is relative. Decide what constitutes “low” for you — low search volume may be par for the course in your particular industry or vertical. You may decide you want to keep low search volume keywords in your toolbox.
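That vetting step is easy to script, too. Below is an illustrative Python filter over hypothetical (keyword, volume, source) rows; in practice the rows would come from your rank tracker’s export, and the threshold is yours to tune.

```python
# Hypothetical (keyword, monthly search volume, source feature) rows --
# in practice these come from your rank tracker's export.
keywords = [
    ("nikon d850 review", 5400, "related"),
    ("what camera do professional photographers use", 90, "paa"),
    ("best nikon lens", 880, "pasf"),
    ("nikon z6 vs z7", 0, "related"),
]

MIN_VOLUME = 100  # "low" is relative; tune this to your vertical

kept = [(kw, vol, src) for kw, vol, src in keywords if vol >= MIN_VOLUME]
dropped_zero = [kw for kw, vol, _ in keywords if vol == 0]

print(f"kept {len(kept)} of {len(keywords)}; {len(dropped_zero)} had no volume")
```

Keeping the source feature on each row means you can later compare which of the three features made the best suggestions.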

The rest is up to you 

Now that you know how to acquire boatloads of relevant keywords straight from Google’s billion-dollar consumer research project (the SERP), it’s time to figure out what your next steps are, which is entirely dependent on your SEO strategy.

Maybe you head straight to optimizing. Perhaps you want to do more vetting, like finding the keywords that surface certain SERP features.

If, for instance, we’re interested in featured snippets and local packs, we’d look to the SERP Features dashboard in STAT to see if any of our new keywords return these features, and then click to get those exact keywords. (We’ve even got a handy dandy write-up on exploring a SERP feature strategy to help get you started.)

Whatever adventure you choose, you’re now armed and ready with a crazy number of keywords, and it’s all thanks to your comprehensive list-building, courtesy of the SERP.

Want to learn how you can get cracking and tracking some more? Reach out to our rad team and request a demo to get your very own personalized walkthrough.

If you’re ready to dig in even deeper, check out how to build an intent-based keyword list to get next-level insight.


Restaurant Local SEO: The Google Characteristics of America’s Top-Ranked Eateries

Posted by MiriamEllis on Apr 25, 2019 in SEO Articles

“A good chef has to be a manager, a businessman and a great cook. To marry all three together is sometimes difficult.”
– Wolfgang Puck

I like this quote. It makes me hear phones ringing at your local search marketing agency, with aspiring chefs and restaurateurs on the other end of the line, ready to bring experts aboard in the “sometimes difficult” quest for online visibility.

Is your team ready for these clients? How comfortable do you feel talking restaurant Local SEO when such calls come in? When was the last time you took a broad survey of what’s really ranking in this specialized industry?

Allow me to be your prep cook today, and I’ll dice up “best restaurant” local packs for major cities in all 50 US states. We’ll julienne Google Posts usage, rough chop DA, make chiffonade of reviews, owner responses, categories, and a host of other ingredients to determine which characteristics are shared by establishments winning this most superlative of local search phrases.

The finished dish should make us conversant with what it takes these days to be deemed “best” by diners and by Google, empowering your agency to answer those phones with all the breezy confidence of Julia Child.

Methodology

I looked at the 3 businesses in the local pack for “best restaurants (city)” in a major city in each of the 50 states, examining 11 elements for each entry, yielding 4,950 data points. I set aside the food processor for this one and did everything manually. I wanted to avoid the influence of proximity, so I didn’t search for any city in which I was physically located. The results, then, are what a traveler would see when searching for top restaurants in destination cities.

Restaurant results

Now, let’s look at each of the 11 data points together and see what we learn. Take a seat at the table!

Categories prove no barrier to entry

Which restaurant categories make up the dominant percentage of local pack entries for our search?

You might think that a business trying to rank locally for “best restaurants” would want to choose just “restaurant” as their primary Google category as a close match. Or, you might think that since we’re looking at best restaurants, something like “fine dining restaurants” or the historically popular “French restaurants” might top the charts.

Instead, what we’ve discovered is that restaurants of every category can make it into the top 3. Fifty-one percent of the ranking restaurants hailed from highly diverse categories, including Pacific Northwest Restaurant, Pacific Rim Restaurant, Organic, Southern, Polish, Lebanese, Eclectic and just about every imaginable designation. American Restaurant is winning out in bulk with 26 percent of the take, and an additional 7 percent for New American Restaurant. I find this an interesting commentary on the nation’s present gustatory aesthetic as it may indicate a shift away from what might be deemed fancy fare to familiar, homier plates.

Overall, though, we see the celebrated American “melting pot” perfectly represented when searchers seek the best restaurant in any given city. Your client’s food niche, however specialized, should prove no barrier to entry in the local packs.

High prices don’t automatically equal “best”

Do Google’s picks for “best restaurants” share a pricing structure?

It will cost you more than $1000 per head to dine at Urasawa, the nation’s most expensive eatery, and one study estimates that the average cost of a restaurant meal in the US is $12.75. When we look at the price attribute on Google listings, we find that the designation “best” is most common for establishments with charges that fall somewhere in between the economical and the extravagant.

Fifty-eight percent of the top ranked restaurants for our search have the $$ designation and another 25 percent have the $$$. We don’t know Google’s exact monetary value behind these symbols, but for context, a Taco Bell with its $1–$2 entrees would typically be marked as $, while the fabled French Laundry gets $$$$ with its $400–$500 plates. In our study, the cheapest and the costliest restaurants make up only a small percentage of what gets deemed “best.”

There isn’t much information out there about Google’s pricing designations, but it’s generally believed that they stem at least in part from the attribute questions Google sends to searchers. So, this element of your clients’ listings is likely to be influenced by subjective public sentiment. For instance, Californians’ conceptions of priciness may be quite different from North Dakotans’. Nevertheless, on the national average, mid-priced restaurants are most likely to be deemed “best.”

Of anecdotal interest: The only locale in which all 3 top-ranked restaurants were designated at $$$$ was NYC, while in Trenton, NJ, the #1 spot in the local pack belongs to Rozmaryn, serving Polish cuisine at $ prices. It’s interesting to consider how regional economics may contribute to expectations, and your smartest restaurant clients will carefully study what their local market can bear. Meanwhile, 7 of the 150 restaurants we surveyed had no pricing information at all, indicating that Google’s lack of adequate information about this element doesn’t bar an establishment from ranking.

Less than 5 stars is no reason to despair

Is perfection a prerequisite for “best”?

Negative reviews are the stuff of indigestion for restaurateurs, and I’m sincerely hoping this study will provide some welcome relief. The average star rating of the 150 “best” restaurants we surveyed is 4.5. Read that again: 4.5. And the number of perfect 5-star joints in our study? Exactly zero. Time for your agency to spend a moment doing deep breathing with clients.

The highest rating for any restaurant in our data set is 4.8, and only three establishments rated so highly. The lowest is sitting at 4.1. Every other business falls somewhere in-between. These ratings stem from customer reviews, and the 4.5 average proves that perfection is simply not necessary to be “best.”

Breaking down a single dining spot with 73 reviews, a 4.6 star rating was achieved with fifty-six 5-star reviews, four 4-star reviews, three 3-star reviews, two 2-star reviews, and three 1-star reviews. 23 percent of diners in this small review set had a less-than-ideal experience, but the restaurant is still achieving top rankings. Practically speaking for your clients, the odd night when the pho was gummy and the paella was burnt can be tossed onto the compost heap of forgivable mistakes.
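To see how that works out, here’s the weighted-average arithmetic behind a 4.6 rating, using the star counts quoted above (Google doesn’t publish its exact rounding rules, so treat this as an approximation):

```python
# Star-count breakdown from the example above (illustrative only; Google's
# exact rounding behavior isn't public).
breakdown = {5: 56, 4: 4, 3: 3, 2: 2, 1: 3}

total_reviews = sum(breakdown.values())
weighted = sum(stars * count for stars, count in breakdown.items())
rating = round(weighted / total_reviews, 1)

print(rating)  # 4.6
```

A dozen imperfect reviews barely move the needle when they sit alongside fifty-odd 5-star ones, which is exactly why perfection isn’t a prerequisite.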

Review counts matter, but differ significantly

How many reviews do the best restaurants have?

It’s folk wisdom that any business looking to win local rankings needs to compete on native Google review counts. I agree with that, but was struck by the great variation in review counts across the nation and within given packs. Consider:

- The greatest number of reviews in our study was earned by Hattie B’s Hot Chicken in Nashville, TN, coming in at a whopping 4,537! Meanwhile, Park Heights Restaurant in Tupelo, MS is managing a 3-pack ranking with just 72 reviews, the lowest in our data set.
- 35 percent of “best”-ranked restaurants have between 100–499 reviews and another 31 percent have between 500–999 reviews. Taken together, that’s 66 percent of contenders having yet to break 1,000 reviews.
- A restaurant with fewer than 100 reviews has only a 1 percent chance of ranking for this type of search.

Anecdotally, I don’t know how much data you would have to analyze to be able to find a truly reliable pattern regarding winning review counts. Consider the city of Dallas, where the #1 spot has 3,365 reviews, but spots #2 and #3 each have just over 300. Compare that to Tallahassee, where a business with 590 reviews is coming in at #1 above a competitor with twice that many. Everybody ranking in Boise has well over 1,000 reviews, but nobody in Bangor is even breaking into the 200s.

The takeaway from this data point is that the national average review count is 893 for our “best” search, but that there is no magic threshold you can tell a restaurant client they need to cross to get into the pack. Totals vary so much from city to city that your best plan of action is to study the client’s market and strongly urge full review management without making any promise that hitting 1,000 reviews will ensure them beating out that mysterious competitor who is sweeping up with just 400 pieces of consumer sentiment. Remember, no local ranking factor stands in isolation.

Best restaurants aren’t best at owner responses

How many of America’s top chophouses have replied to reviews in the last 60 days?

With a hat tip to Jason Brown at the Local Search Forum for this example of a memorable owner response to a negative review, I’m sorry to say I have some disappointing news. Only 29 percent of the restaurants ranked best in all 50 states had responded to their reviews in the 60 days leading up to my study. There were tributes of lavish praise, cries for understanding, and seething remarks from diners, but less than one-third of owners appeared to be paying the slightest bit of attention.

On the one hand, this indicates that review responsiveness is not a prerequisite for ranking for our desirable search term, but let’s go a step further. In my view, whatever time restaurant owners may be gaining back via unresponsiveness is utterly offset by what they stand to lose if they make a habit of overlooking complaints. Review neglect has been cited as a possible cause of business closure. As my friends David Mihm and Mike Blumenthal always say: “Your brand is its reviews,” and mastering the customer service ecosystem is your surest way to build a restaurant brand that lasts.

For your clients, I would look at any local pack with neglected reviews as representative of a weakness. Algorithmically, your client’s active management of the owner response function could become a strength others lack. But I’ll even go beyond that: Restaurants ignoring how large segments of customer service have moved onto the web are showing a deficit of commitment to the long haul. It’s true that some eateries are famous for thriving despite offhand treatment of patrons, but in the average city, a superior commitment to responsiveness could increase many restaurants’ repeat business, revenue and rankings.

Critic reviews nice but not essential

I’ve always wanted to investigate critic reviews for restaurants, as Google gives them a great deal of screen space in the listings:

How many times were critic reviews cited in the Google listings of America’s best restaurants and how does an establishment earn this type of publicity?

With 57 appearances, Lonely Planet is the leading source of professional reviews for our search term, with Zagat and 10Best making strong showings, too. It’s worth noting that 70/150 businesses I investigated surfaced no critic reviews at all. They’re clearly not a requirement for being considered “best”, but most restaurants will benefit from the press. Unfortunately, there are few options for prompting a professional review. To wit:

Lonely Planet — Founded in 1972, Lonely Planet is a travel guide publisher headquartered in Australia. Critic reviews like this one are written for their website and guidebooks simultaneously. You can submit a business for review consideration via this form, but the company makes no guarantees about inclusion.

Zagat — Founded in 1979, Zagat began as a vehicle for aggregating diner reviews. It was purchased by Google in 2011 and sold off to The Infatuation in 2018. Restaurants can’t request Zagat reviews. Instead, the company conducts its own surveys and selects businesses to be rated and reviewed, like this.

10Best — Owned by USA Today Travel Media Group, 10Best employs local writers/travelers to review restaurants and other destinations. Restaurants cannot request a review.

The Infatuation — Founded in 2009 and headquartered in NY, The Infatuation employs diner-writers to create reviews like this one based on multiple anonymous dining experiences, which are then published via their app. They also have an SMS-based restaurant recommendation system. They do not accept requests from restaurants hoping to be reviewed.

AFAR — Founded in 2009, AFAR is a travel publication with a website, magazine, and app which publishes reviews like this one. There is no form for requesting a review.

Michelin — Founded as a tire company in 1889 in France, Michelin’s subsidiary ViaMichelin is a digital mapping service that houses the reviews Google is pulling. In my study, Chicago, NYC and San Francisco were the only three cities that yielded Michelin reviews like this one and one article states that only 165 US restaurants have qualified for a coveted star rating. The company offers this guide to dining establishments.

As you can see, the surest way to earn a professional review is to become notable enough on the dining scene to gain the unsolicited notice of a critic. 

Google Posts hardly get a seat at best restaurant tables

How many picks for best restaurants are using the Google Posts microblogging feature?

As it turns out, only a meager 16 percent of America’s “best” restaurants in my survey have made any use of Google Posts. In fact, most of the usage I saw wasn’t even current. I had to click the “view previous posts on Google” link to surface past efforts. This statistic is much worse than what Ben Fisher found when he took a broader look at Google Posts utilization and found that 42 percent of local businesses had at least experimented with the feature at some point.

For whatever reason, the eateries in my study are largely neglecting this influential feature, and this knowledge could encompass a competitive advantage for your restaurant clients.

Do you have a restaurateur who is trying to move up the ranks? There is some evidence that devoting a few minutes a week to this form of microblogging could help them get a leg up on lazier competitors.

Google Posts are a natural match for restaurants because they always have something to tout, some appetizing food shot to share, some new menu item to celebrate. As the local SEO on the job, you should be recommending an embrace of this element for its valuable screen real estate in the Google Business Profile, local finder, and maybe even in local packs.

Waiter, there’s some Q&A in my soup

What is the average number of questions top restaurants are receiving on their Google Business Profiles?

Commander’s Palace in New Orleans is absolutely stealing the show in my survey with 56 questions asked via the Q&A feature of the Google Business Profile. Only four restaurants had zero questions. The average number of questions across the board was eight.

As I began looking at the data, I decided not to re-do this earlier study of mine to find out how many questions were actually receiving responses from owners, because I was winding up with the same story. Time and again, answers were being left up to the public, resulting in consumer relations like these:

Takeaway: As I mentioned in a previous post, Greg Gifford found that 40 percent of his clients’ Google Questions were leads. To leave those leads up to the vagaries of the public, including a variety of wags and jokesters, is to leave money on the table. If a potential guest is asking about dietary restrictions, dress codes, gift cards, average prices, parking availability, or ADA compliance, can your restaurant clients really afford to allow a public “maybe” to be the only answer given?

I’d suggest that a dedication to answering questions promptly could increase bookings, cumulatively build the kind of reputation that builds rankings, and possibly even directly impact rankings as a result of being a signal of activity.

A moderate PA & DA gets you into the game

What are the average Page Authority and Domain Authority of restaurants ranking as “best”?

Looking at both the landing page that Google listings are pointing to and the overall authority of each restaurant’s domain, I found that:

- The average PA is 36, with a high of 56 and a low of zero, the zeroes being represented by one restaurant with no website link and one restaurant appearing to have no website at all.
- The average DA is 41, with a high of 88. One business lacked a website link while actually having a DA of 56, and another had no apparent website at all. The lowest linked DA I saw was 6.
- PA/DA do not equal rankings. Within the 50 local packs I surveyed, 32 exhibited the #1 restaurant having a lower DA than the establishments sitting at #2 or #3. In one extreme case, a restaurant with a DA of 7 was outranking a website with a DA of 32, and then there were the two businesses with the missing website link or missing website. But, for the most part, knowing the range of PA/DA in a pack you are targeting will help you create a baseline for competing.
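One way to turn a surveyed pack into that baseline is to summarize the authority spread. A quick sketch, with made-up DA values standing in for your own Link Explorer exports:

```python
from statistics import median

# Hypothetical DA values gathered for one target pack -- substitute the
# figures you pull for your client's actual competitors.
pack_da = [41, 32, 7]

baseline = {
    "low": min(pack_da),
    "median": median(pack_da),
    "high": max(pack_da),
}
print(baseline)
```

The median is a more honest target than the mean here, since a single high-DA outlier in a pack would otherwise skew the goal you set for a client.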

While pack DA/PA differs significantly from city to city, the average numbers we’ve discovered shouldn’t be out-of-reach for established businesses. If your client’s restaurant is brand new, it’s going to take some serious work to get up to market averages, of course.

Local Search Ranking Factors 2019 found that DA was the 9th most important local pack ranking signal, with PA sitting at factor #20. Once you’ve established a range of DA/PA for a local SERP you are trying to move a client up into, your best bet for making improvements will include improving content so that it earns links and powering up your outreach for local links and linktations.

Google’s Local Finder “web results” show where to focus management

Which websites does Google trust enough to cite as references for restaurants?

As it turns out, that trust is limited to a handful of sources:

As the above pie chart shows:

- The restaurant’s website was listed as a reference for 99 percent of the candidates in our survey. More proof that you still need a website in 2019, for the very good reason that it feeds data to Google.
- Yelp is highly trusted at 76 percent and TripAdvisor is going strong at 43 percent. Your client is likely already aware of the need to manage their reviews on these two platforms. Be sure you’re also checking them for basic data accuracy.
- OpenTable and Facebook are each getting a small slice of Google trust, too.

Not shown in the above chart are 13 restaurants that had a web reference from a one-off source, like the Des Moines Register or Dallas Eater. A few very famous establishments, like Brennan’s in New Orleans, surfaced their Wikipedia page, although they didn’t do so consistently. I noticed Wikipedia pages appearing one day as a reference and then disappearing the next day. I was left wondering why.

For me, the core takeaway from this factor is that if Google is highlighting your client’s listing on a given platform as a trusted web result, your agency should go over those pages with a fine-toothed comb, checking for accuracy, activity, and completeness. These are citations Google is telling you are of vital importance.

A few other random ingredients

As I was undertaking this study, there were a few things I noted down but didn’t formally analyze, so consider this as mixed tapas:

- Menu implementation is all over the place. While many restaurants are linking directly to their own website via Google’s offered menu link, some are using other services like Single Platform, and far too many have no menu link at all.
- Reservation platforms like OpenTable are making a strong showing, but many restaurants are drawing a blank on this Google listing field, too. Many, but far from all, of the restaurants designated “best” feature Google’s “reserve a table” function, which stems from partnerships with platforms like OpenTable and RESY.
- Order links are pointing to multiple sources, including DoorDash, Postmates, GrubHub, Seamless, and in some cases, the restaurant’s own website (smart!). But, in many cases, no use is being made of this function.
- Photos were present for every single best-ranked restaurant. Their quality varied, but they are clearly a “given” in this industry.
- Independently-owned restaurants are the clear winners for my search term. With the notable exception of an Olive Garden branch in Parkersburg, WV, and a Cracker Barrel in Bismarck, ND, the top competitors were either single-location or small multi-location brands. For the most part, neither Google nor the dining public associates large chains with “best”.
- Honorable mentions go to Bida Manda Laotian Bar & Grill for what looks like a gorgeous and unusual restaurant ranking #1 in Raleigh, NC, and to Kermit’s Outlaw Kitchen of Tupelo, MS for the most memorable name in my data set. You can get a lot of creative inspiration from just spending time with restaurant data.

A final garnish to our understanding of this data

I want to note two things as we near the end of our study:

- Local rankings emerge from the dynamic scenario of Google’s opinionated algorithms + public opinion and behavior. Doing Local SEO for restaurants means managing a ton of different ingredients: website SEO, link building, review management, GBP signals, etc. We can’t offer clients a generic “formula” for winning across the board. This study has helped us understand national averages so that we can walk into the restaurant space feeling conversant with the industry. In practice, we’ll need to discover the true competitors in each market to shape our strategy for each unique client. And that brings us to some good news.
- As I mentioned at the outset of this survey, I specifically avoided proximity as an influence by searching as a traveler to other destinations would. I investigated one local pack for each major city I “visited”. The glad tidings are that, for many of your restaurant clients, there is going to be more than one chance to rank for a search like “best restaurants (city)”. Unless the eatery is in a very small town, Google is going to whip up a variety of local packs based on the searcher’s location. So, that’s something hopeful to share.

What have we learned about restaurant local SEO?

A brief TL;DR you can share easily with your clients:

- While the US shows a predictable leaning towards American restaurants, any category can be a contender. So, be bold!
- Mid-priced restaurants are considered “best” to a greater degree than the cheapest or most expensive options. Price for your market.
- While you’ll likely need at least 100 native Google reviews to break into these packs, well over half of competitors have yet to break the 1,000 mark.
- An average of 71 percent of competitors are revealing a glaring weakness by neglecting to respond to reviews, so get in there and start embracing customer service to distinguish your restaurant!
- A little over half of your competitors have earned critic reviews. If you don’t yet have any, there’s little you can do to earn them beyond becoming well enough known for anonymous professional reviewers to visit you. In the meantime, don’t sweat it.
- About three-quarters of your competitors are completely ignoring Google Posts; gain the advantage by getting active.
- Potential guests are asking nearly every competitor questions, and so many restaurants are leaving leads on the table by allowing random people to answer. Embrace fast responses to Q&A to stand out from the crowd.
- With few exceptions, devotion to authentic link earning efforts can build up your PA/DA to competitive levels.
- Pay attention to any platform Google is citing as a resource to be sure the information published there is complete and accurate.
- The current management of other Google Business Profile features like Menus, Reservations, and Ordering paints a veritable smorgasbord of providers and a picture of prevalent neglect. If you need to improve visibility, explore every profile field that Google is giving you.

A question for you: Do you market restaurants? Would you be willing to share a cool local SEO tactic with our community? We’d love to hear about your special sauce in the comments below.

Wishing you bon appétit for working in the restaurant local SEO space, with delicious wins ahead!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

WordPress Mistakes Everyone Makes

Posted by on Apr 25, 2019 in Greg's SEO Articles | Comments Off on WordPress Mistakes Everyone Makes

WordPress is the most obvious choice for both personal and professional websites and blogs. In fact, corporate giants like Facebook, Disney, and Sony use it. There are so many themes, plugins, and features to choose from, and it’s fairly simple to use. But a lot of users still make mistakes when using it.

1. PICKING THE WRONG PLATFORM

If you’re looking for a platform that can provide you with enough features to host your blog or personal website, then you would probably be satisfied with WordPress.com (a hosted blog; its domain would be example.wordpress.com). However, if you prefer full control over what happens with your website or blog, the self-hosted WordPress.org is the way to go.

2. TOO MANY PLUGINS USED

With over 55,000 plugins available, it can be easy to get caught up in them all and install a bunch you don’t need, which then end up slowing down your website. Just stick to the plugins you really need.

3. NOT INSTALLING A CACHING PLUGIN

Site speed is very important in SEO, so you will need to install a caching plugin. Not only will it decrease your website’s load time, it will also help prevent downtime and allow your site to withstand traffic spikes.

4. NOT HAVING A BACKUP

If your website is hacked, or an error occurs during an update or maintenance, it could take you weeks to get it back to the way it once was. Backup plugins make entire backups (manual and automatic) of WordPress’s files and database possible.

5. FAILING TO INSTALL UPDATES

Updates are only a click away, yet they’re essential to keeping your site secure, so make sure to install them regularly.

6. IGNORING MOBILE USERS

Over 50% of all searches are done on mobile devices, and as people switch between their smartphones, tablets, TVs, and computers all the time, your WordPress site needs to look and function just as well on all of them. You could use a plugin called “WPTouch”, but the best solution is to select a responsive theme that works well on every device.

We Surveyed 1,400 Searchers About Google – Here’s What We Learned

Posted by on Apr 22, 2019 in SEO Articles | Comments Off on We Surveyed 1,400 Searchers About Google – Here’s What We Learned

We Surveyed 1,400 Searchers About Google – Here’s What We Learned

Posted by LilyRayNYC

Google’s search results have seen a whirlwind of major changes in the past two years. Nearly every type of modern-day search query produces a combination of rich results beyond the standard blue links — Featured Snippets, People Also Ask boxes, Knowledge Panels, maps, images, and other enhancements. It is now even possible to browse flights, hotels, jobs, events, and other categories that were previously only available via external websites, directly on Google.

As search marketers, we are keenly aware that both Google’s evolving landscape and the rise in new, rich results impact our bottom line — more SERP enhancements and growth in “position 0” mean less organic traffic for everyone else. Last year, Rand Fishkin posted a remarkable Whiteboard Friday pointing out the unsettling trend that has emerged from the updates to Google’s interface: there are fewer organic links to external websites as traffic flows to Google-owned assets within the SERP.

We often hear about how the digital marketing community feels about changes to Google’s interface, but it is less common to hear the opinions of the average searcher who is less technically-savvy. At Path Interactive, we conducted a survey of 1,400 respondents to better understand how they search, how they feel about Google’s search results, and the quality of information the search engine provides.

A note about our respondents

72 percent of respondents were based in the U.S., 8 percent in India, and 10 percent in Europe or the U.K. 67.8 percent considered themselves somewhat technically-savvy or not technically-savvy at all. 71.3 percent were under the age of 40.


How often do searchers use Google to find things?

It shouldn’t be much of a surprise that the vast majority of respondents — 77 percent — use Google 3+ times a day to search for things online. The frequency of Google usage is also inversely correlated with age; 80 percent of 13–21-year-olds use Google more than three times per day, while only 60 percent of respondents over 60 search with the same frequency.

How often do searchers click ads vs. organic results?

As many previous studies have shown, the vast majority of searchers prefer clicking on organic results to clicking on advertisements. 72 percent of respondents stated that they either click only on organic results, or on organic results the majority of the time. Age also plays a role in one’s decision to click on a paid or organic result: searchers ages 60+ are 200 percent more likely than 18–21-year-olds not to discriminate between a paid and organic listing; instead, they click on whichever result type best answers their question.

Interactions with organic results

The vast majority of respondents remain on the first page of Google to find an answer to their query. 75 percent of respondents either click on the first one or two results, scan page one looking for the most relevant answer to their query, or visit multiple results from page one. 17 percent of respondents stated part of their search behavior includes looking for content from websites or brands that they trust. Only 7 percent of respondents indicated that they browse past the first results page to see as many results as possible.

According to these results, younger users are more likely to click on the first 1–2 results on page one, while older users are more likely to explore additional results, browsing farther down on the first page — or even onto the second and third pages — to find the information they’re looking for.

This trend raises some interesting questions about user behavior: are older searchers more skeptical, and therefore likely to look for a larger variety of answers to their questions? Are younger users more concerned with getting answers quickly, and more likely to settle for the first result they see? Is this tied to the rise in featured snippets? Will this search behavior become the “new normal” as teens grow older, or do younger searchers change their habits over time? If it is the future, will this trend make it even more difficult for organic results that don’t rank in the top three positions to sustain traffic over time?

How do users feel about featured snippets and the Knowledge Panel?

When it comes to how users feel about featured snippets, the majority of searchers say that their behavior depends on what is displayed in the snippet. Marketers who are concerned that snippets steal traffic away from organic results might be pleased to learn that a relatively low number of respondents — only 22.1 percent — indicate that they generally read the snippet and consider their question answered without clicking the blue link.


However, this data suggests another potentially alarming trend as it relates to featured snippet interactions and age: the youngest searchers (13–18) are 220 percent more likely than the oldest searchers (70–100) to consider their question answered without clicking on the snippet (or any other) result. Conversely, the older respondents (60–100) are 170 percent more likely to continue searching, depending on the answer in the snippet. This again points to younger searchers seeming to prioritize getting a response quickly, while older users are more likely to spend time evaluating a variety of results.


When it comes to the trustworthiness of featured snippets, most users are on the fence: 44.5 percent of users consider the information “semi-trustworthy,” and continue searching for answers to their questions. However, age once again plays a role in the results. Young searchers (13–30) are 40 percent more likely than older searchers (50+) to trust the information contained in featured snippets. Additionally, the youngest category of searchers (13–18) is 53 percent more likely than average to trust featured snippets.


The same outcome is true for Knowledge Panel results — the majority of users (55.3 percent) scan this information but continue searching through the other results. However, 36.8 percent of searchers consider the information contained in the Knowledge Panel sufficient to answer their questions, and this represents a decent amount of search traffic that previously flowed to paid and organic results before the existence of the Knowledge Panel.


As with previous questions, younger users are significantly more likely to read the information in the Knowledge Panel and consider their search complete. Young respondents (13–21) are 102 percent more likely to consider the Knowledge Panel a complete answer to their question than older respondents (50+), who generally continue their search after seeing the Knowledge Panel.

Weather forecasts, things to do, jobs, flights, and other Google SERP features

Google has rolled out many new result types that allow searchers to get the answer to their question directly within the search results. This alarms many search marketers, who worry that these results cannibalize traffic that previously flowed to organic results and have caused an increase in “no click searches.” So, how does the average searcher feel about these enhancements to the SERP?

We asked searchers about two types of results: results that directly answer search queries using a proprietary Google widget (such as weather forecasts or “Things to Do”), as well as results that allow for interaction on Google, but include an organic link back to a corresponding website (such as recipes and flight results).


According to the data, the majority of respondents use these features but continue browsing the other search results. It is interesting to note that one-third of respondents usually ignore result types such as job listings, events, and flights, and instead skip over to the regular blue links. Older searchers (50+) are 63 percent more likely than younger searchers (13–30) to ignore these result types and continue their search.

Incorrect information in SERP features

Our next question was whether searchers have found incorrect information in any of the aforementioned result types. Given Google’s increased focus on content quality and E-A-T, we thought it would be interesting to see the general sentiment around the accuracy of these search features.


A combined 58.2 percent of searchers state they have either occasionally or frequently seen incorrect information in rich results on Google. This fact is certainly on Google’s radar: just last month, Google published a whitepaper on how it combats disinformation, and the recent major updates to its algorithm reflect Google’s critical recent quest to promote accurate, trustworthy content in all of its results.

How do users feel about Google?

We wanted to know how users feel about Google in general, especially given all the recent changes to Google’s search results. 68 percent of respondents stated that they feel the quality of Google’s results has improved over time, and the majority of respondents don’t have specific complaints about Google.

Among those respondents who do have issues with Google, the most common complaints involve Google showing too many ads; prioritizing content from large corporations, which makes it harder for small businesses to compete; and showing too many Google-owned assets within the results.


We also opened up the survey to allow respondents to leave feedback about how they feel about Google and the quality of its results. The vast majority of responses related to user privacy, the unsettling feeling of sharing private information with the search engine, and disliking that search queries are used in retargeting campaigns. Several respondents were concerned about the political and philosophical implications of Google deciding what content should or should not be prominently featured in its results. Some complaints had to do with the limited options to apply filters and perform advanced searches in both standard results, as well as on Google Images.

Searchers are still skeptical of Google, but there’s some cause for concern among younger users

Should businesses and marketers be worried that Google’s increasingly rich results will slowly steal away our precious traffic for good, and increase the number of no-click results? The results from our Google Usage survey indicate that, at least for now, there’s no need to panic: Searchers are still prone to gravitating toward the regular blue links, both organic and paid. They are largely skeptical about taking all of the information included in rich results at face value.

However, there is data to support that younger searchers are more likely to implicitly trust the information provided in rich results, and less likely to visit deeper pages of the search results during their search journeys. This should be an interesting trend for marketers to pay attention to over time — one that raises many philosophical questions about the role that information from Google should play in our lives.

With its recent push for E-A-T compliance, it’s clear that Google is already grappling with the moral responsibility of providing information that can majorly impact the happiness, safety, and well-being of its users. But what happens when important information doesn’t meet the ranking criteria laid out by Google’s algorithm? What happens when society’s understanding of certain topics and ideas changes over time? Does Google’s algorithm create an echo chamber and limit the ability for users to share and discover diverse viewpoints? What happens when the information Google shares is blatantly wrong, or even worse, dangerous?

While it is important that Google maintains the highest quality standards for displaying credible and trustworthy information, freedom of speech and diversity of ideas must also remain of utmost importance, as future generations become increasingly trusting of the information they discover in the search results.

And now, you tell us: how do you feel about Google’s changing landscape?

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

WordPress Security Myths

Posted by on Apr 22, 2019 in Greg's SEO Articles | Comments Off on WordPress Security Myths

Currently, almost 33% of websites run WordPress, so it’s important to make sure it’s secure. However, there are a few myths circulating about its security; a handful of them are addressed below.

1. SAFETY IN NUMBERS

Some believe that, since their website is just one of many, it isn’t interesting enough to get hacked. But there is no such thing as being too small for hackers to notice you.

In fact, if your website is brand new, some hackers may assume it is especially susceptible to attack.

2. WORDPRESS IS AN INSECURE CONTENT MANAGEMENT PLATFORM

Another major misconception is that WordPress has a weak protection system. In fact, plugins have advanced enough to protect a website against most attacks (most hacks are found to be due to user error [weak password, outdated security system, etc.] more than anything else).

3. SECURE USERNAME AND PASSWORD SUFFICE

This misconception has led to numerous breaches; a secure username and password alone won’t stop attacks that target vulnerable code rather than your login. Mitigate the risk with a security plugin like “All-in-One WP Security” or “WordFence”.

4. SSL CERTIFICATE GUARANTEES SAFETY

SSL only protects the information passed from user to website and vice versa; it doesn’t do anything to protect the data stored on said site. As mentioned before, security plugins can considerably mitigate this problem.

The One-Hour Guide to SEO: Link Building – Whiteboard Friday

Posted by on Apr 22, 2019 in SEO Articles | Comments Off on The One-Hour Guide to SEO: Link Building – Whiteboard Friday

The One-Hour Guide to SEO: Link Building – Whiteboard Friday

Posted by randfish

The final episode in our six-part One-Hour Guide to SEO series deals with a topic that’s a perennial favorite among SEOs: link building. Today, learn why links are important to both SEO and to Google, how Google likely measures the value of links, and a few key ways to begin earning your own.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. We are back with our final part in the One-Hour Guide to SEO, and this week talking about why links matter to search engines, how you can earn links, and things to consider when doing link building.

Why are links important to SEO?

So we’ve discussed sort of how search engines rank pages based on the value they provide to users. We’ve talked about how they consider keyword use and relevant topics and content on the page. But search engines also have this tool of being able to look at all of the links across the web and how they link to other pages, how they point between pages.



So it turns out that Google had this insight early on that what other people say about you is more important, at least to them, than what you say about yourself. So you may say, “I am the best resource on the web for learning about web marketing.” But it turns out Google is not going to believe you unless many other sources, that they also trust, say the same thing. Google’s big innovation, back in 1997 and 1998, when Sergey Brin and Larry Page came out with their search engine, Google, was PageRank, this idea that by looking at all the links that point to all the pages on the internet and then sort of doing this recursive process of seeing which are the most important and most linked to pages, they could give each page on the web a weight, an amount of PageRank.

Then those pages that had a lot of PageRank, because many people linked to them or many powerful people linked to them, would then pass more weight on when they linked. That understanding of the web is still in place today. It’s still a way that Google thinks about links. They’ve almost certainly moved on from the very simplistic PageRank formula that came out in the late ’90s, but that thinking underlies everything they’re doing.
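That recursive weight-passing idea can be sketched in a few lines of code. This is a simplified illustration of the late-’90s PageRank concept described above, not how Google ranks today; the damping factor and the toy link graph are illustrative assumptions.

```python
# Simplified sketch of the original PageRank intuition: every page's weight is
# repeatedly redistributed along its outgoing links, so pages that many pages
# (or weighty pages) point to accumulate more weight. Toy data; not Google's
# actual, far more sophisticated system.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        # Every page keeps a small base amount, then receives shares
        # of weight from the pages that link to it.
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        for page, targets in links.items():
            if targets:
                share = rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += damping * share
        rank = new_rank
    return rank

# "c" is linked to by both "a" and "b", so it accumulates the most weight.
web = {"a": ["c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
```

Here "c" ends up outranking "a", which outranks "b": being linked to by an already-weighty page ("c" linking to "a") passes more value than receiving no links at all.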

How does Google measure the value of links?

Today, Google measures the value of links in many very sophisticated ways, which I’m not going to try and get into, and they’re not public about most of these anyway. But there is a lot of intelligence that we have about how they think about links, including things like more important, more authoritative, more well-linked-to pages are going to pass more weight when they link.

A.) More important, authoritative, well-linked-to pages pass more weight when they link

That’s true of both individual URLs, an individual page, and websites, a whole website. So for example, if a page on The New York Times links to yoursite.com, that is almost certainly going to be vastly more powerful and influential in moving your rankings or moving your ability to rank in the future than if randstinysite.info — which I haven’t yet registered, but I’ll get on that — links to yoursite.com.

This weighting, this understanding of there are powerful and important and authoritative websites, and then there are less powerful and important and authoritative websites, and it tends to be the case that more powerful ones tend to provide more ranking value is why so many SEOs and marketers use metrics like Moz’s domain authority or some of the metrics from Moz’s competitors out in the software space to try and intuit how powerful, how influential will this link be if this domain points to me.

B.) Diversity of domains, rate of link growth, and editorial nature of links ALL matter

So the different kinds of domains and the rate of link growth and the editorial nature of those links all matter. So, for example, if I get many new links from many new websites that have never linked to me before and they are editorially given, meaning I haven’t spammed to place them, I haven’t paid to place them, they were granted to me because of interesting things that I did or because those sites wanted to editorially endorse my work or my resources, and I do that over time in greater quantities and at a greater rate of acceleration than my competitors, I am likely to outrank them for the words and phrases related to those topics, assuming that all the other smart SEO things that we’ve talked about in this One-Hour Guide have also been done.

C.) HTML-readable links that don’t have rel=”nofollow” and contain relevant anchor text on indexable pages pass link benefit

HTML readable links, meaning as a simple text browser browses the web or a simple bot, like Googlebot, which can be much more complex as we talked about in the technical SEO thing, but not necessarily all the time, those HTML readable links that don’t have the rel=”nofollow” parameter, which is something that you can append to links to say I don’t editorially endorse this, and many, many websites do.

If you post a link to Twitter or to Facebook or to LinkedIn or to YouTube, they’re going to carry this rel=”nofollow”, saying I, YouTube, don’t editorially endorse this website that this random user has uploaded a video about. Okay. Well, it’s hard to get a link from YouTube. And it contains relevant anchor text on an indexable page, one that Google can actually browse and see, that is going to provide the maximum link benefit.

So a link like <a href="https://yoursite.com">great tool for audience intelligence</a> would be the ideal link for my new startup, for example, which is SparkToro, because we do audience intelligence and someone saying we’re a tool is perfect. This is a link that Google can read, and it provides this information about what we do.

It says great tool for audience intelligence. Awesome. That is powerful anchor text that will help us rank for those words and phrases. There are loads more. There are things like which pages linked to and which pages linked from. There are spam characteristics and trustworthiness of the sources. Alt attributes, when they’re used in image tags, serve as the anchor text for the link, if the image is a link.

There’s the relationship, the topical relationship of the linking page and linking site. There’s text surrounding the link, which I think some tools out there offer you information about. There’s location on the page. All of this stuff is used by Google and hundreds more factors to weight links. The important part for us, when we think about links, is generally speaking if you cover your bases here, it’s indexable, carries good anchor text, it’s from diverse domains, it’s at a good pace, it is editorially given in nature, and it’s from important, authoritative, and well linked to sites, you’re going to be golden 99% of the time.

Are links still important to Google?

Many folks I think ask wisely, “Are links still that important to Google? It seems like the search engine has grown in its understanding of the web and its capacities.” Well, there is some pretty solid evidence that links are still very powerful. I think the two most compelling to me are, one, the correlation of link metrics over time. 

So like Google, Moz itself produces an index of the web. It is billions and billions of pages. I think it’s actually trillions of pages, trillions of links across hundreds of billions of pages. Moz produces metrics like number of linking root domains to any given domain on the web or any given page on the web.

Moz has a metric called Domain Authority or DA, which sort of tries to best replicate or best correlate to Google’s own rankings. So metrics like these, over time, have been shockingly stable. If it were the case someday that Google demoted the value of links in their ranking systems, basically said links are not worth that much, you would expect to see a rapid drop.

But from 2007 to 2019, we’ve never really seen that. It’s fluctuated. Mostly it fluctuates based on the size of the link index. So for many years Ahrefs and Majestic were bigger link indices than Moz. They had better link data, and their metrics were better correlated.

Now Moz, since 2018, is much bigger and has higher correlation than they do. So the various tools are sort of warring with each other, trying to get better and better for their customers. You can see those correlations with Google pretty high, pretty standard, especially for a system that supposedly contains hundreds, if not thousands of elements.

When you see a correlation of 0.25 or 0.3 with one number, linking root domains or page authority or something like that, that’s pretty surprising. The second one is that many SEOs will observe this, and I think this is why so many SEO firms and companies pitch their clients this way, which is the number of new, high quality, editorially given linking root domains, linking domains, so The New York Times linked to me, and now The Washington Post linked to me and now wired.com linked to me, these high-quality, different domains, that correlates very nicely with ranking positions.

So if you are ranking number 12 for a keyword phrase and suddenly that page generates many new links from high-quality sources, you can expect to see rapid movement up toward page one, position one, two, or three. This is very frequent; it is obviously not the only factor at work, but it is very common.

How do I get links?

So I think the next reasonable question to ask is, “Okay, Rand, you’ve convinced me. Links are important. How do I get some?” Glad you asked. There are an infinite number of ways to earn new links, and I will not be able to represent them all here. But professional SEOs and professional web marketers often use tactics that fall under a few buckets, and this is certainly not an exhaustive list, but it can give you some starting points.

1. Content & outreach

The first one is content and outreach. Essentially, the marketer finds a resource that they could produce, that is relevant to their business, what they provide for customers, data that they have, interesting insights that they have, and they produce that resource knowing that there are people and publications out there that are likely to want to link to it once it exists.

Then they let those people and publications know. This is essentially how press and PR work. This is how a lot of content building and link outreach work. You produce the content itself, the resource, whatever it is, the tool, the dataset, the report, and then you message the people and publications who are likely to want to cover it or link to it or talk about it. That process is tried-and-true. It has worked very well for many, many marketers. 

2. Link reclamation

Second is link reclamation. So this is essentially the process of saying, “Gosh, there are websites out there that used to link to me, that stopped linking.” The link broke. The link points to a 404, a page that no longer loads on my website.

The link was supposed to be a link, but they didn’t include the link. They said SparkToro, but they forgot to actually point to the SparkToro website. I should drop them a line. Maybe I’ll tweet at them, at the reporter who wrote about it and be like, “Hey, you forgot the link.” Those types of link reclamation processes can be very effective as well.

They’re often some of the easiest, lowest hanging fruit in the link building world. 
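The triage step of link reclamation lends itself to a simple script. This sketch assumes you have exported your backlinks from a link research tool into a mapping and know which URLs on your site currently resolve; the function name, domains, and data are all hypothetical.

```python
# Flag backlinks pointing at URLs that no longer exist on your site, i.e.
# candidates for reclamation outreach. Illustrative data, not a real export.

def broken_backlinks(backlinks, live_urls):
    """backlinks: dict of referring_page -> the URL on your site it targets."""
    return {ref: url for ref, url in backlinks.items() if url not in live_urls}

backlinks = {
    "news-site.com/article": "example.com/old-report",  # now a 404
    "blog.example.org/post": "example.com/",
}
live_urls = {"example.com/", "example.com/new-report"}
to_reclaim = broken_backlinks(backlinks, live_urls)
```

Each flagged referrer is a candidate for a quick “hey, that link broke” note, or for a redirect from the dead URL to its replacement.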

3. Directories, resource pages, groups, events, etc.

Directories, resource pages, groups, events, and other things that you can join and participate in, both online and offline, often link to your site, so long as they have a website. The process is simply joining or submitting or sponsoring or what have you.

Most of the time, for example, when I get invited to speak at an event, they will take my biography, a short, three-sentence blurb, that includes a link to my website and what I do, and they will put it on their site. So pitching to speak at events is a way to get included in these groups. I started Moz with my mom, Gillian Muessig, and Moz has forever been a woman-owned business, and so there are women-owned business directories.

I don’t think we actually did this, but we could easily go, “Hey, you should include Moz as a woman-owned business. We should be part of your directory here in Seattle.” Great, that’s a group we could absolutely join and get links from. 

4. Competitors’ links

So this is basically the practice of replicating your competitors’ links, and you almost certainly will need to use tools to do this. There are some free ways to do it.

The simple, free way to do it is to say, “I have competitor 1 brand name and competitor 2 brand name. I’m going to search for the combination of those two in Google, and I’m going to look for places that have written about and linked to both of them and see if I can also replicate the tactics that got them coverage.” The slightly more sophisticated way is to go use a tool. Moz’s Link Explorer does this.

So do tools from people like Majestic and Ahrefs. I’m not sure if SEMrush does. But basically you can plug in, “Here’s me. Here’s my competitors. Tell me who links to them and does not link to me.” Moz’s tool calls this the Link Intersect function. But you don’t even need the link intersect function.

You just plug in a competitor’s domain and look at here are all the links that point to them, and then you start to replicate their tactics. There are hundreds more and many, many resources on Moz’s website and other great websites about SEO out there that talk about many of these tactics, and you can certainly invest in those. Or you could conceivably hire someone who knows what they’re doing to go do this for you. Links are still powerful. 
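At its core, the intersect idea is just set arithmetic over backlink exports. A sketch (the domain lists below are made up; any backlink tool can export the real ones):

```python
# Hypothetical backlink exports: the set of domains linking to each site.
competitor_a = {"blog.example", "news.example", "mag.example"}
competitor_b = {"news.example", "mag.example", "forum.example"}
my_site      = {"mag.example"}

# Domains that link to BOTH competitors are proven linkers in your space...
links_to_both = competitor_a & competitor_b

# ...and the ones that don't yet link to you are your outreach list.
opportunities = links_to_both - my_site

print(sorted(opportunities))  # ['news.example']
```

That's all a "link intersect" report is, run at much larger scale.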

Okay. Thank you so much. I want to say a huge amount of appreciation to Moz and to Tyler, who’s behind the camera — he’s waving right now, you can’t see it, but he looks adorable waving — and to everyone who has helped make this possible, including Cyrus Shepard and Britney Muller and many others.

Hopefully, this one-hour segment on SEO can help you upgrade your skills dramatically. Hopefully, you’ll send it to some other folks who might need to upgrade their understanding and their skills around the practice. And I’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

In case you missed them:

Check out the other episodes in the series so far:

The One-Hour Guide to SEO, Part 1: SEO Strategy
The One-Hour Guide to SEO, Part 2: Keyword Research
The One-Hour Guide to SEO, Part 3: Searcher Satisfaction
The One-Hour Guide to SEO, Part 4: Keyword Targeting & On-Page Optimization
The One-Hour Guide to SEO, Part 5: Technical SEO

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Recover WordPress SEO after a Hack

Posted by on Apr 18, 2019 in Greg's SEO Articles | Comments Off on Recover WordPress SEO after a Hack

If you’ve already restored and secured your website and you’re now more concerned with making sure that the search engine optimization of your site hasn’t been affected, here are some other things to do after a hack.

DOWNLOAD LATEST WORDPRESS AGAIN

Re-downloading WordPress and replacing your core files ensures that you're running the latest version and that those files are clear of malicious code.

CHECK WEBSITE FOR MALWARE THROUGH THE THEME EDITOR

This’ll help you confirm that the malware has been removed.

SET UP REDIRECTS (IF NECESSARY OR APPROPRIATE)

If the hackers created temporary pages on your website, set up 301 redirects from them to legitimate pages. This'll improve user experience and retain the value of any compromised pages that search engines indexed before you fixed them.
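If your site runs on Apache, one way to do this is a couple of lines in .htaccess. The paths below are hypothetical stand-ins for whatever spam URLs were actually indexed:

```apache
# Permanently redirect hacker-created pages to relevant real pages
Redirect 301 /cheap-pills-page/ https://example.com/blog/
Redirect 301 /spam-landing.html https://example.com/
```

A WordPress redirect plugin (such as Redirection) accomplishes the same thing without editing files by hand.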

CHECK METADATA

A common hacking trick is changing website metadata, so check your meta titles and meta descriptions for spammy or unauthorized edits.

CHECK WEBMASTER TOOLS

Google Search Console (formerly Webmaster Tools) will flag malicious content it detects on your site. You can even request a re-crawl of your site after cleanup.

Why Your WordPress Site Doesn’t Rank High on Google

Posted by on Apr 15, 2019 in Greg's SEO Articles | Comments Off on Why Your WordPress Site Doesn’t Rank High on Google

A website is critical to the success of your online business: it's the first thing users will look for, so you have to make sure they can find you easily. Search engine optimization (SEO) is the most reliable way to achieve this.

So it's essential to understand the causes of low rankings. Let's take a look!

GOOGLE HASN’T INDEXED YOUR WEBSITE

This happens to newly-launched WordPress sites since it takes a few weeks for this search engine giant to detect and list you.

Simply type 'site:' followed by your domain name into the search box, e.g. site:example.com. If your website shows up in the results, Google has indexed it; if you can't find it, it may take a few more days before it appears for that search query.

META TAGS ARE POORLY OPTIMIZED

Meta tags are website components that allow Google to analyze and rank your content. You have to make sure meta tags are crafted properly:

- Meta description: explains webpage content (up to 160 characters)
- Meta title: another way to keep Google informed about the content
- Image tags: visual elements are also crawled and analyzed, so make sure to mark images adequately.
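For reference, here's roughly what well-formed tags look like in a page's HTML. The values are invented; in WordPress, an SEO plugin typically writes these for you:

```html
<head>
  <!-- Meta title: the clickable headline Google shows in results -->
  <title>Handmade Leather Bags | Example Shop</title>
  <!-- Meta description: the snippet under the title; keep it under ~160 characters -->
  <meta name="description" content="Shop handmade leather bags crafted in small batches. Free shipping on orders over $50.">
</head>
<!-- Descriptive alt text helps Google understand the image -->
<img src="tote-bag.jpg" alt="Brown leather tote bag with brass buckles">
```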

WEBSITE’S NOT MOBILE-FRIENDLY

Your job is to build a responsive site that functions perfectly across multiple devices. This is best done by using a new, mobile-friendly theme, but it can also be done plugin-wise (WPTouch and JetPack being just two examples).

YOUR CONTENT IS BAD

Content is one thing that really has the power to make or break your online efforts. A typical first-page result on Google contains almost 2,000 words: today's users want in-depth content that covers a topic from all angles and gives visitors a thorough explanation of the question at hand.

YOU HAVE NO BACKLINKS

If no one shares links that lead back to your website, it probably means you aren't publishing quality content. Google takes this into account and treats your site as less authoritative, so it won't rank higher in user searches.

YOUR SITE LOADS SLOWLY

Visitors typically expect a web page to load in less than three seconds. To make that possible, test your site's load speed with a tool such as Pingdom; these tests will show you what you can do to make your site load faster.

YOU DON’T HAVE ANY SOCIAL MEDIA ACTIVITY

Social buzz improves the visibility of pages, so if you don’t have business accounts on platforms like Facebook or Instagram, you should start using them right away.

YOU HAVE TOO MANY AFFILIATE LINKS

Websites that focus on affiliate marketing instead of quality content are going to get penalized by Google sooner or later.

TOUGH COMPETITION

Lastly, competition may be too tough for a certain keyword. If that’s the case, focus on a less competitive niche by targeting the right audience and finding keywords that won’t overlap with the ones used by industry leaders.

Reducing WordPress Site CPU Usage

Posted by on Apr 11, 2019 in Greg's SEO Articles | Comments Off on Reducing WordPress Site CPU Usage

While WordPress is praised for its ease of use and user-friendliness, it can also be resource-hungry. Fortunately, there are simple fixes.

AVOID FANCY DESIGNS AND ANY GIMMICKS YOU WON’T BE USING

Make sure first and foremost that your website is fast-loading and message-focused before worrying about elegance; otherwise, you're going to have a site that doesn't convert well. A slider, for example, is nice, but you'll have to resize its images considerably so it doesn't slow everything down.

CHECK YOUR WEBSITE PLUGINS CAREFULLY

Sometimes what's most resource-demanding is hidden behind the scenes in plugins. Disable them one by one, test your site's performance, and check CPU usage with each one turned off. Doing this will show you which ones you need to remove or replace.

OPTIMIZE IMAGES

Just as with your sliders, if you don't optimize images, they'll demand their share of resources and keep WordPress from operating optimally. We recommend a plugin called reSmush.it, which is very fast and optimizes images upon upload.

GET A CONTENT DELIVERY NETWORK

With a CDN such as Cloudflare, static files on your site (images, videos, etc.) are served from external servers, leaving more power for WordPress. Ergo, usage spikes are prevented.

CLEAN THE DATABASE REGULARLY

Many plugins grow in size over time and clutter the WordPress database, so it requires regular cleanups. There are plenty of plugins available for database cleanups that don't require technical knowledge.

GET A CACHING PLUGIN

Instead of generating content every time a particular user visits your website, such a plugin will create static versions of your site’s webpages. Therefore, CPU usage is decreased even further.
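The idea can be sketched in a few lines of Python. This is a toy model of caching, not actual plugin code:

```python
# Toy illustration of what a caching plugin does: build the page once,
# then serve the stored copy instead of rebuilding it on every visit.
cache = {}

def render_page(slug):
    """Stand-in for WordPress assembling a page from the database (expensive)."""
    return f"<html><body>Post: {slug}</body></html>"

def serve(slug):
    if slug not in cache:        # first visit: do the expensive work once
        cache[slug] = render_page(slug)
    return cache[slug]           # later visits: cheap, static copy
```

Real caching plugins (WP Super Cache, W3 Total Cache, and the like) apply this same pattern by writing rendered pages to disk or memory.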

How To Force Q&A On a GMB Page That Doesn’t Have It

Posted by on Apr 11, 2019 in SEO Articles | Comments Off on How To Force Q&A On a GMB Page That Doesn’t Have It

How To Force Q&A On a GMB Page That Doesn’t Have It

Here’s a stupid little GMB Q&A thing I figured out yesterday I thought you all might enjoy.

I was asked by my new friend, Ruben Coll Molina of PA Digital in Spain, what event triggers the Q&A functionality in a GMB profile. Ruben had found that some of their SMB customers did not have the functionality. He sent me to this SERP for "nouvelle couquette", a clothing store in Torrent, Spain. At the time, their GMB did not display the "Ask a Question" module like this:

Ok, I doctored it. Of course I forgot to take a “before” screenshot, but trust me, I’m an SEO consultant…

Anyhow, I searched for "women's clothing stores in Torrent, Spain," got a local pack, clicked the "More Places" link, and saw Nouvelle Couquette listed in the Google Maps Local Finder. This time it had the Q&A widget, though no questions had been asked:

On a hunch, using my best 7th grade Spanish, I asked a question:

Ruben answered:

A few seconds later we witnessed the Miracle of Questions and Answers:

Who's more macho?