Blog

Best WP SEO Plugins

Posted by on Jun 21, 2019 in Greg's SEO Articles | Comments Off on Best WP SEO Plugins

There are plenty of great plug-ins, but the ones on this list will enhance your SEO efforts.

1. Yoast SEO

This is probably the most popular WordPress SEO plugin, used by over five million websites. One of its best features is XML sitemap management, which lets you easily create your own sitemaps – no coding, and no fixing things when something isn’t working.

For content lovers, there’s the content optimization snippet preview, which lets you add your keyword, meta title, and meta description and preview them as they’ll appear in search results. You also get tips and indications of whether your content needs more on-site optimization, including reducing keyword stuffing.

Moreover, Yoast SEO helps you identify and avoid duplicate content so you don’t get penalized by Google.

2. SEO Framework

Here’s another great plugin, one better suited to small businesses than to big companies. The interface looks like a native part of WordPress, it delivers fast SEO solutions, and it’s time-efficient. Not to mention that interacting with it feels very natural.

It has a built-in AI that automatically optimizes your pages, which gives you lots of possibilities to create a better website. It comes preconfigured but also gives you the option to change any settings you want. You can improve your search results and your social presence too.

3. Broken Link Checker

This plugin parses your whole website and lists every broken link it finds. You’ll find that list in a new tab of the WP admin panel, under “Tools” -> “Broken Links”. For each broken link, there are several actions you can take: “Edit link”, “Unlink”, “Not broken”, and “Dismiss”.

4. All in One Schema Rich Snippets

This plugin improves how your pages appear in search engine results by adding rich snippets. It works best for schema implementations such as Recipes, Events, People, Products, Articles, and so on.

Using it gives search engines more accurate information about your website, helps your results stand out in the SERPs, and gives you a competitive advantage.
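
The plugin handles the markup for you, but under the hood rich snippets are plain schema.org structured data embedded in the page. As a rough, hand-rolled sketch (not the plugin’s actual output – the author name and field values are placeholders), Article markup could be emitted from a theme’s functions.php like this:

add_action( 'wp_head', function () {
    if ( ! is_singular( 'post' ) ) {
        return;
    }
    // Placeholder values; a real plugin builds these from actual post data.
    $schema = array(
        '@context'      => 'https://schema.org',
        '@type'         => 'Article',
        'headline'      => get_the_title(),
        'datePublished' => get_the_date( 'c' ),
        'author'        => array( '@type' => 'Person', 'name' => 'Greg' ),
    );
    echo '<script type="application/ld+json">' . wp_json_encode( $schema ) . '</script>';
} );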

5. Rank Math

This plugin helps you optimize your content and outrank your competitors. One of the coolest things is that it supports schema-based themes and also AMP pages.

With Rank Math you can catch lots of errors and get plenty of information about your website:

- easy setup via the step-by-step installation and configuration wizard;
- rank tracking to follow your keyword positions, plus LSI keyword integration;
- an advanced website analysis section to spot any errors that need fixing;
- a modular framework, so you have complete control of your website;
- a smart redirection manager;
- a 404 monitor that identifies 404 pages so you can fix them;
- internal linking management and suggestions;
- Google Search Console integration;
- easy configuration for rich snippets, and many more.

6. All in One SEO Pack

Here’s an easy WordPress plugin for beginners and small businesses that want to improve their website and increase their rankings – but it does have advanced features and an API for developers. For example:

- XML sitemap support;
- Google AMP support;
- Google Analytics integration;
- webmaster verification options for Google, Bing, and Pinterest;
- automatically generated meta tags;
- a built-in API and compatibility with a lot of other plugins;
- advanced canonical URLs, and many more.

7. SEOPress

This simple, fast, and very powerful SEO plugin has loads of features that you can easily enable or disable as required:

- Discover keyword suggestions for your content via Google’s own suggestions.
- Fine-tune your posts with the content analysis tool.
- Track Google Analytics events and traffic from the dashboard.
- Easily create and manage 301, 302, and 307 redirects.
- Check your site’s performance with Google PageSpeed.
- Implement Google structured data such as Product, Article, Event, Local Business, Review, Video, Course, Recipe, and so on.

Optimize Your Posts before Publishing

Posted by on Jun 18, 2019 in Greg's SEO Articles | Comments Off on Optimize Your Posts before Publishing

Optimizing for SEO is a process of continuous improvement, especially when it comes to your webpage or blog content. Here’s a list of recommendations on how to do that.

USE THE BEST TITLE FOR YOUR ARTICLE

You don’t want a title that’s boring or too long, so write one that’s instantly catchy but still 50-60 characters long.

It’s easier than you think! If you search “Title Creator” or “Title Generator” you’ll be able to make plenty of good headlines for your post.

USE THE BEST PERFORMING KEYWORDS

This is easy to do as well! Simply go to Google’s own Keyword Planner tool and find a keyword that’s neither too short nor too long and has plenty of search volume. After that, make sure to place that keyword in the title, in the post’s URL, and within the first 100 words (or first paragraph) of the article body.

MAKE SURE TO USE H1 TAGS FOR YOUR TITLE

H1 is the tag Google looks for when forming its search results, so make sure your title uses it! Reserve H2 and H3 for structure within the article itself.
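
In WordPress this is usually handled by the theme’s single-post template rather than by you in the editor. A minimal sketch of a standard loop (real template tags, simplified markup) that puts the post title in the page’s single H1:

<?php if ( have_posts() ) : while ( have_posts() ) : the_post(); ?>
    <h1><?php the_title(); ?></h1>  <!-- the post title as the page's single H1 -->
    <h2>A section subheading</h2>   <!-- H2/H3 only for structure within the article -->
    <?php the_content(); ?>
<?php endwhile; endif; ?>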

BUILD METADATA FOR ANY IMAGES / VIDEOS IN THE ARTICLE

Metadata describes an image or video to Google’s web crawlers. For an image, that means its title, alt text, and caption. Metadata can also help site visitors should an image fail to load. WordPress lets you add all of this in the Media Library.
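
If you output images from a template rather than through the editor, the alt text can also be passed explicitly. A minimal sketch using WordPress’s attachment API – $attachment_id stands in for a real Media Library ID, and the alt text is just an example:

// Outputs an <img> tag with the alt attribute set; if the 'alt' key is
// omitted, WordPress falls back to the alt text saved in the Media Library.
echo wp_get_attachment_image(
    $attachment_id, // placeholder: a real Media Library attachment ID
    'large',
    false,
    array( 'alt' => 'Red bicycle leaning against a brick wall' )
);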

MAKE IMAGES SMALLER SO THEY LOAD FASTER

This can’t be stressed enough: make sure your post loads fast for viewers, and the best way to do that is to size your images properly.
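
WordPress can generate properly sized copies automatically when images are uploaded. A minimal functions.php sketch – the size name and dimensions are arbitrary examples, and $attachment_id is again a placeholder:

// Register an 800px-wide image size (height unconstrained, no cropping);
// WordPress generates this copy for every new upload.
add_image_size( 'blog-body', 800, 0, false );

// Then request that size in templates instead of the full-size original:
echo wp_get_attachment_image( $attachment_id, 'blog-body' );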

MAKE CONTENT ENGAGING

Just like your title, your content needs to be engaging. One way is to add links within your content.

MAKE PARAGRAPHS SHORT/SHORTER

Google and SEO experts say that shorter paragraphs are easier for Google to crawl. They also make reading easier, because most readers scan an article before reading it. Make complex sentences simpler to read (tools like Grammarly, Ginger, and Hemingway App can help), and highlight the key points in your article.

MAKE SOCIAL BUTTONS EASY TO SEE AND USE

Social media is a huge source of traffic for shared content, so make sure sharing buttons are visible (as near the top as possible) so users can share your posts. A plugin can add these automatically for you!

USE SEO PLUGINS

Yoast is a perfect example! It helps you form SEO-friendly titles and descriptions (metadata as well), recommends where to place your target keyword, and provides tips as you write.

Secure WordPress from Hackers

Posted by on May 28, 2019 in Greg's SEO Articles | Comments Off on Secure WordPress from Hackers

According to WP White Security, more than 70% of WordPress websites are vulnerable to a hack attack! But don’t worry – it’s easier to deal with than you think!

INSTALL WORDPRESS SECURITY PLUGINS

It’s never a bad idea to find the security plugin that works for you! “All-in-One WP Security” is the one we use and recommend to clients because it has brute-force protection, a file-change scanner, a firewall, and much more! It even has a “Security Strength Meter” to show how secure you’ve made your site!

USE STRONG CREDENTIALS

Create a password (even one auto-generated by WordPress) that’s at least 12 characters long and that mixes uppercase and lowercase letters, numbers, and symbols rather than letters or numbers alone.
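
WordPress even ships a helper for generating passwords like this. A quick sketch – the length and flags are just example choices:

// 16 characters, with special characters and "extra" special characters enabled.
$password = wp_generate_password( 16, true, true );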

KEEP A CONSTANT WEBSITE BACKUP

This is a must-have no matter what happens! Use “BackWPup” and have it run a backup at least once a month!
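
BackWPup has its own scheduler built in, but for the curious, recurring jobs like this sit on top of WP-Cron. A minimal sketch of that mechanism, assuming a custom “monthly” interval – the hook name is hypothetical, and the callback is where a real backup plugin would do its actual work:

// Register a custom "monthly" interval for WP-Cron.
add_filter( 'cron_schedules', function ( $schedules ) {
    $schedules['monthly'] = array(
        'interval' => 30 * DAY_IN_SECONDS,
        'display'  => 'Once a month',
    );
    return $schedules;
} );

// Schedule the event once (for example, on plugin activation).
if ( ! wp_next_scheduled( 'my_monthly_backup' ) ) {
    wp_schedule_event( time(), 'monthly', 'my_monthly_backup' );
}

// Placeholder callback; a real backup plugin does the heavy lifting here.
add_action( 'my_monthly_backup', function () {
    error_log( 'Backup hook fired at ' . gmdate( 'c' ) );
} );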

SCAN YOUR SITE FOR MALWARE

Plugins make it possible to scan your WordPress site for malicious code. A scanner like “Anti-Malware from GOTMLS.NET” is free – get it and scan your site immediately!

LIMIT YOUR LOGIN ATTEMPTS

“All-in-One WP Security” can do this too, so make sure to cap the number of login attempts so a hacker is locked out after a number of failed tries to break in!
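
Under the hood, this kind of throttling hooks into WordPress’s login events. A minimal sketch of the idea only – the transient key scheme and the five-attempt limit are arbitrary choices, and the plugin’s real implementation is more sophisticated:

// Count failed logins per IP address for 15 minutes.
add_action( 'wp_login_failed', function () {
    $key   = 'failed_logins_' . md5( $_SERVER['REMOTE_ADDR'] ?? '' );
    $tries = (int) get_transient( $key );
    set_transient( $key, $tries + 1, 15 * MINUTE_IN_SECONDS );
} );

// Refuse further attempts once the limit is reached.
add_filter( 'authenticate', function ( $user ) {
    $key = 'failed_logins_' . md5( $_SERVER['REMOTE_ADDR'] ?? '' );
    if ( (int) get_transient( $key ) >= 5 ) {
        return new WP_Error( 'too_many_attempts', 'Too many failed logins. Please try again later.' );
    }
    return $user;
}, 30 );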

CREATE A NEW ADMIN IF “ADMIN” IS YOUR USERNAME

Hackers try the username “admin” first because it’s the most commonly used username in WordPress. Make a new account with administrator rights and delete the old one so they can’t get in!
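
If you’d rather script it than click through the Users screen, the core user API covers both steps. A one-off sketch – the new username and email address are placeholders, and wp_delete_user() reassigns the old account’s posts so nothing is lost:

// Run once (for example, from a temporary snippet), then remove.
require_once ABSPATH . 'wp-admin/includes/user.php'; // provides wp_delete_user() outside wp-admin

$new_id = wp_insert_user( array(
    'user_login' => 'site_owner',                 // hypothetical new username
    'user_pass'  => wp_generate_password( 16, true ),
    'user_email' => 'owner@example.com',          // placeholder address
    'role'       => 'administrator',
) );

$old = get_user_by( 'login', 'admin' );
if ( $old && ! is_wp_error( $new_id ) ) {
    wp_delete_user( $old->ID, $new_id ); // reassign the old admin's content to the new user
}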

USE TWO-FACTOR AUTHENTICATION

“All-in-One WP Security” does this too by adding a CAPTCHA, and it can even send you a verification code by cell phone! This makes it much harder for a potential hacker to get into your site!

KEEP WORDPRESS UP TO DATE

This is easily done without you pressing a single button by adding this to your wp-config.php:

// Enables automatic WordPress core updates, including major releases.
define( 'WP_AUTO_UPDATE_CORE', true );

WordPress Security Checklist

Posted by on May 21, 2019 in Greg's SEO Articles | Comments Off on WordPress Security Checklist

We have discussed security on our blog at length, but it can’t be stressed enough: do yourself a favor and check off these must-dos to keep your WordPress site secure!

USE SECURE HOSTING

First and foremost is secure hosting – don’t go for the cheapest package you can find; that makes it too easy for your site to be attacked and turns your hosting into a source of frustration.

HIDE WORDPRESS IDENTIFIERS

This means hiding anything that reveals your site runs on WordPress. A perfect example is the “wp-admin” login page, whose URL can be changed using security plugins like “All-in-One WP Security”. The ‘created on WordPress’ tagline is one you should remove ASAP as well.
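
One identifier you can remove with a single line is the generator meta tag, which advertises your WordPress version in every page’s head. A minimal functions.php sketch:

// Stop WordPress from printing <meta name="generator" content="WordPress x.y"> in the <head>.
remove_action( 'wp_head', 'wp_generator' );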

INSTALL A SECURITY PLUGIN

This cannot be stressed enough, because a good plugin makes security – and managing it – so much easier. One way to find a plugin that suits your needs is to Google “WordPress security plugins for [INSERT INDUSTRY TAG HERE] websites”. For example, if your website is eCommerce-intensive, search “WordPress security plugins for eCommerce websites” – a site like that might need a more robust platform than a free plugin can provide.

The aforementioned “All-in-One WP Security” and “WordFence” are both prime examples of free yet solid options for most websites.

KEEP PASSWORDS SECURE

This is obvious, but make a password that hardly anyone could guess. You do have to keep it in a safe place, though; one free tool, “BitWarden”, manages all your passwords behind a single master password.

PROTECT INPUT FIELDS

Make sure every input field – contact forms especially – has a CAPTCHA or some other way to prove that submissions aren’t coming from bots. That includes your login page, where a security plugin (even a free one) can add a CAPTCHA so brute-force attacks against your website become much more difficult.
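
Plugins are the practical route here, but to illustrate the underlying idea, here’s a minimal honeypot sketch for the built-in comment form: a hidden field that humans never see but bots tend to fill in. The field name is arbitrary:

// Add a hidden field to the comment form (shown to logged-out visitors).
add_action( 'comment_form_after_fields', function () {
    echo '<p style="display:none"><input type="text" name="hp_field" value=""></p>';
} );

// Reject any submission where the hidden field was filled in.
add_filter( 'preprocess_comment', function ( $commentdata ) {
    if ( ! empty( $_POST['hp_field'] ) ) {
        wp_die( 'Comment rejected.' ); // almost certainly a bot
    }
    return $commentdata;
} );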

GENERATE BACKUPS

This may seem obvious, but skipping backups is a simple mistake to make and perhaps the most costly. A free plugin, “BackWPup”, can take care of this for you right away and can even be scheduled to run complete backups of your site(s).

STAY UP-TO-DATE

Whenever you see that your WordPress version, a theme, or a plugin needs updating, do it as soon as you can, if not right away. It will go a long way towards keeping your site secure.

In fact, there is a way to do this automatically without pressing any buttons! Put this code at the bottom of your “wp-config.php” file:

         // Enables automatic WordPress core updates, including major releases.
         define( 'WP_AUTO_UPDATE_CORE', true );

Then, put this code at the bottom of your activated theme’s “functions.php” file:

         // Opt every plugin and theme into automatic background updates.
         add_filter( 'auto_update_plugin', '__return_true' );
         add_filter( 'auto_update_theme', '__return_true' );

Multilingual Plugins for WordPress

Posted by on May 16, 2019 in Greg's SEO Articles | Comments Off on Multilingual Plugins for WordPress

By sticking to just one language on your website you’re limiting your sales from around the world. As people find new ways to communicate instantly, you’re more and more likely to communicate with people whose first language isn’t English. You’ll be able to reach far more people by translating text into their language. Here are 4 plugins we recommend:

POLYLANG

With this plugin you can add as many languages as you need to every page or post you’re working on. Although it doesn’t do the translation for you, it makes it very easy to scale your content for multilingual purposes – you don’t need multiple websites in different languages, because a visitor to your site simply selects the language of their choice.

LOCO TRANSLATE

This plugin does more than just translate posts – it draws on the wide variety of language packs already available for WordPress and works from there. It costs $5.95 monthly, but you can translate 2,000 words for free.

WORDPRESS MULTILINGUAL PLUGIN (WPML)

This plugin lets you automatically or even manually translate your content. More than 40 languages are available for auto-translation. It will cost $29 for “Standard” and $79 for “Advanced” but it has a solid reputation.

GOOGLE LANGUAGE TRANSLATOR

Last but not least is the Google Language Translator. Not only is it free, but Google Translate has become substantially more accurate, and it can support over 100 languages!

Reading Between the Lines – Three Deeper Takeaways from John Mueller at BrightonSEO

Posted by on May 14, 2019 in SEO Articles | Comments Off on Reading Between the Lines – Three Deeper Takeaways from John Mueller at BrightonSEO


Last Friday I had the pleasure of watching John Mueller of Google being interviewed on the BrightonSEO main stage by (Distilled alumna!) Hannah Smith. I found it hugely interesting how different it was from the previous similarly formatted sessions with John I’ve seen – by Aleyda at BrightonSEO previously, and more recently by my colleague Will Critchlow at SearchLove. In this post, I want to get into some of the interesting implications in what John did and, crucially, did not say.

I’m not going to attempt here to cover everything John said exhaustively – if that’s what you’re looking for, I recommend this post by Deepcrawl’s Sam Marsden, or this transcript via Glen Allsopp (from which I’ve extracted below). This will also not be a tactical post – I was listening to this Q&A from the perspective of wanting to learn more about Google, not necessarily what to change in my SEO campaigns on Monday morning.

Looking too closely?

I’m aware of the dangers of reading too much into the minutiae of what John Mueller, Gary Illyes, and crew come out with – especially when he’s talking live and unscripted on stage. Ultimately, as John said himself, it’s his job to establish a flow of information between webmasters and search engineers at Google. There are famously few people, or arguably no people at all, who know the ins and outs of the search algorithm itself, and it is not John’s job to get into it in this depth.

That said, he has been trained, and briefed, and socialised, to say certain things, to not say certain things, to focus on certain areas, and so on. This is where our takeaways can get a little more interesting than the typical, clichéd “Google says X” or “we think Google is lying about Y”. I’d recommend this presentation and deck from Will if you want to read more about that approach, and some past examples.

So, into the meat of it.

1. “We definitely use links to recognize new content”

Hannah: Like I said, this is top tier sites…  Links are still a ranking factor though, right? You still use links as a ranking factor?

John: We still use links. I mean it’s not the only ranking factor, so like just focusing on links, I don’t think that makes sense at all… But we definitely use links to recognize new content.

Hannah: So if you then got effectively a hole, a very authoritative hole in your link graph… How is that going to affect how links are used as a ranking factor or will it?

John: I dunno, we’ll see. I mean it’s one of those things also where I see a lot of times the sites that big news sites write about are sites that already have links anyway. So it’s rare that we wouldn’t be able to find any of that new content. So I don’t think everything will fall apart. If that happens or when that happens, but it does make it a little bit harder for us. So it’s kind of tricky, but we also have lots of other signals that we look at. So trying to figure out how relevant a page is, is not just based on the links too.

The context here is that Hannah was interested in how much of a challenge it is for Google when large numbers of major editorial sites start adding the “nofollow” attribute to all their external links – which has been a trend of late in the UK, and I suspect elsewhere. If authoritative links are still an important trust factor, does this not weaken that data?

The interesting thing for me here was very much in what John did not say. Hannah asks him fairly directly whether links are a ranking factor, and he evades three times, by discussing the use of links for crawling & discovering content, rather than for establishing a link graph and therefore a trust signal:

“We still use links”
“We definitely use links to recognize new content”
“It’s rare we wouldn’t be able to find any of that new content”

There’s also a fourth example, earlier in the discussion – before the excerpt above –  where he does the same:

“…being able to find useful content on the web, links kind of play a role in that.”

This is particularly odd as in general, Google is pretty comfortable still discussing links as a ranking factor. Evidently, though, something about this context caused this slightly evasive response. The “it’s not the only ranking factor” response feels like a bit of an evasion too, given that Google essentially refuses to discuss other ranking factors that might establish trust/authority, as opposed to just relevance and baseline quality – see my points below on user signals!

Personally, I also thought this comment was very interesting and somewhat vindicating of my critique of a lot of ranking factor studies:

“…a lot of the times the sites that big news sites write about are sites that already have links anyway”

Yeah, of course – links are correlated with just about any other metric you can imagine, whether it be branded search volume, social shares, click-through rate, whatever.

2. Limited spots on page 1 for transactional sites

Hannah: But thinking about like a more transactional query, for example. Let’s just say that you want to buy some contact lenses, how do you know if the results you’ve ranked first is the right one? If you’ve done a good job of ranking those results?

John: A lot of times we don’t know, because for a lot of these queries there is no objective, right or wrong. They’re essentially multiple answers that we could say this could make sense to show as the first result. And I think in particular for cases like that, it’s useful for us to have those 10 blue links or even 10 results in the search page, where it’s really something like we don’t completely know what you’re looking for. Are you looking for information on these contact lenses? Do you want to buy them? Do you want to compare them? Do you want to buy a specific brand maybe from this-

This is one of those things where I think I could have figured this out from the information I already had, but it clicked into place for me listening to this explanation from John. If John is saying there’s a need to show multiple intents on the first page for even a fairly commercial query, there is an implication that only so many transactional pages can appear.

Given that, in many verticals, there are far more than 10 viable transactional sites, this means that if you drop from being the 3rd best to the 4th best among those, you could drop from, for example, position 5 to position 11. This is particularly important to keep in mind when we’re analysing search results statistically – whether it be in ranking factor studies or forecasting the results of our SEO campaigns, the relationship between the levers we pull and the outputs we see can be highly non-linear. A small change might move you 6 ranking positions, past sites which have a different intent and totally different metrics when it comes to links, on-page optimisation, or whatever else.

3. User signals as a ranking factor

Hannah: Surely at that point, John, you would start using signals from users, right? You would start looking at which results are clicked through most frequently, would you start looking at stuff like that at that point?

John: I don’t think we would use that for direct ranking like that. We use signals like that to analyze the algorithms in general, because across a million different search queries we can figure out like which one tends to be more correct or not, depending on where people click. But for one specific query for like a handful of pages, it can go in so many different directions. It’s really-

So, the suggestion here is that user signals – presumably CTR (click-through rates), dwell time, etc. – are used to appraise the algorithm, but not as part of the algorithm. This has been the line from Google for a while, but I found this response far more explicit and clear than John M’s skirting round the subject in the past.

It’s difficult to square this with some past experiments from the likes of Rand Fishkin manipulating rankings with hundreds of people in a conference hall clicking results for specific queries, or real world results I’ve discussed here. In the latter case, we could maybe say that this is similar to Panda – Google has machine learned what on-site attributes go with users finding a site trustworthy, rather than measuring trust & quality directly. That doesn’t explain Rand’s results, though.

Here are a few explanations I think are possible:

Google just does not want to admit to this, because it’d look spammable (whether or not it actually is)
In fact, they use something like “site recent popularity” as part of the algorithm, so, on a technicality, don’t need to call it CTR or user signals
The algorithm is constantly appraising itself, and adjusts in response to a lot of clicks on a result that isn’t p1 – but the ranking factor that gets adjusted is some arbitrary attribute of that site, not the user signal itself

Just to explain what I mean by the third one a little further – imagine if there are three sites ranking for a query, which are sites A, B, & C. At the start, they rank in that order – A, B, C. It just so happens, by coincidence, that site C has the highest word count.

Lots of people suddenly search the query and click on result C. The algorithm is appraising itself based on user signals, for example, cases where people prefer the 3rd place result, so needs to adjust to make this site rank higher. Like any unsupervised machine learning, it finds a way, any way, to fit the desired outcome to the inputs for this query, which in this case is weighting word count more highly as a ranking factor. As such, result C ranks first, and we all claim CTR is the ranking factor. Google can correctly say CTR is not a ranking factor, but in practice, it might as well be.

For me, the third option is the most contrived, but it also fits most easily with my real-world experience; that said, either of the other explanations, or even all three, could be true.

Discussion

I hope you’ve enjoyed my rampant speculation. It’s only fair that you get to join in too: tweet me at @THCapper, or get involved in the comments below.

How Often Does Google Update Its Algorithm?

Posted by on May 14, 2019 in SEO Articles | Comments Off on How Often Does Google Update Its Algorithm?


Posted by Dr-Pete

In 2018, Google reported an incredible 3,234 improvements to search. That’s more than 8 times the number of updates they reported in 2009 — less than a decade ago — and an average of almost 9 per day. How have algorithm updates evolved over the past decade, and how can we possibly keep tabs on all of them? Should we even try?

To kick this off, here’s a list of every confirmed count we have (sources at end of post):

2018 – 3,234 “improvements”
2017 – 2,453 “changes”
2016 – 1,653 “improvements”
2013 – 890 “improvements”
2012 – 665 “launches”
2011 – 538 “launches”
2010 – 516 “changes”
2009 – 350–400 “changes”

Unfortunately, we don’t have confirmed data for 2014-2015 (if you know differently, please let me know in the comments).

A brief history of update counts

Our first peek into this data came in spring of 2010, when Google’s Matt Cutts revealed that “on average, [Google] tends to roll out 350–400 things per year.” It wasn’t an exact number, but given that SEOs at the time (and to this day) were tracking at most dozens of algorithm changes, the idea of roughly one change per day was eye-opening.

In fall of 2011, Eric Schmidt was called to testify before Congress, and revealed our first precise update count and an even more shocking scope of testing and changes:

“To give you a sense of the scale of the changes that Google considers, in 2010 we conducted 13,311 precision evaluations to see whether proposed algorithm changes improved the quality of its search results, 8,157 side-by-side experiments where it presented two sets of search results to a panel of human testers and had the evaluators rank which set of results was better, and 2,800 click evaluations to see how a small sample of real-life Google users responded to the change. Ultimately, the process resulted in 516 changes that were determined to be useful to users based on the data and, therefore, were made to Google’s algorithm.”

Later, Google would reveal similar data in an online feature called “How Search Works.” Unfortunately, some of the earlier years are only available via the Internet Archive, but here’s a screenshot from 2012:

Note that Google uses “launches” and “improvements” somewhat interchangeably. This diagram provided a fascinating peek into Google’s process, and also revealed a startling jump from 13,311 precision evaluations (changes that were shown to human evaluators) to 118,812 in just two years.

Is the Google algorithm heating up?

Since MozCast has kept the same keyword set since almost the beginning of data collection, we’re able to make some long-term comparisons. The graph below represents five years of temperatures. Note that the system was originally tuned (in early 2012) to an average temperature of 70°F. The redder the bar, the hotter the temperature …


You’ll notice that the temperature ranges aren’t fixed — instead, I’ve split the label into eight roughly equal buckets (i.e. they represent the same number of days). This gives us a little more sensitivity in the more common ranges.

The trend is pretty clear. The latter half of this 5-year timeframe has clearly been hotter than the first half. While a warming trend is evident, though, it’s not the steady increase over time that Google’s update counts might suggest. Instead, we see a stark shift in the fall of 2016 and a very hot summer of 2017. More recently, we’ve actually seen signs of cooling. Below are the means and medians for each year (note that 2014 and 2019 are partial years):

2019 – 83.7° / 82.0°
2018 – 89.9° / 88.0°
2017 – 94.0° / 93.7°
2016 – 75.1° / 73.7°
2015 – 62.9° / 60.3°
2014 – 65.8° / 65.9°

Note that search engine rankings are naturally noisy, and our error measurements tend to be large (making day-to-day changes hard to interpret). The difference from 2015 to 2017, however, is clearly significant.

Are there really 9 updates per day?

No, there are only 8.86 – feel better? Ok, that’s probably not what you meant. Even back in 2009, Matt Cutts said something pretty interesting that seems to have been lost in the mists of time…

“We might batch [algorithm changes] up and go to a meeting once a week where we talk about 8 or 10 or 12 or 6 different things that we would want to launch, but then after those get approved … those will roll out as we can get them into production.”

In 2016, I did a study of algorithm flux that demonstrated a weekly pattern evident during clearer episodes of ranking changes. From a software engineering standpoint, this just makes sense — updates have to be approved and tend to be rolled out in batches. So, while measuring a daily average may help illustrate the rate of change, it probably has very little basis in the reality of how Google handles algorithm updates.

Do all of these algo updates matter?

Some changes are small. Many improvements are likely not even things we in the SEO industry would consider “algorithm updates” — they could be new features, for example, or UI changes.

As SERP verticals and features evolve, and new elements are added, there are also more moving parts subject to being fixed and improved. Local SEO, for example, has clearly seen an accelerated rate of change over the past 2-3 years. So, we’d naturally expect the overall rate of change to increase.

A lot of this is also in the eye of the beholder. Let’s say Google makes an update to how they handle misspelled words in Korean. For most of us in the United States, that change isn’t going to be actionable. If you’re a Korean brand trying to rank for a commonly misspelled, high-volume term, this change could be huge. Some changes also are vertical-specific, representing radical change for one industry and little or no impact outside that niche.

On the other hand, you’ll hear comments in the industry along the lines of “There are 3,000 changes per year; stop worrying about it!” To me that’s like saying “The weather changes every day; stop worrying about it!” Yes, not every weather report is interesting, but I still want to know when it’s going to snow or if there’s a tornado coming my way. Recognizing that most updates won’t affect you is fine, but it’s a fallacy to stretch that into saying that no updates matter or that SEOs shouldn’t care about algorithm changes.

Ultimately, I believe it helps to know when major changes happen, if only to understand whether rankings shifted due something we did or something Google did. It’s also clear that the rate of change has accelerated, no matter how you measure it, and there’s no evidence to suggest that Google is slowing down.

Appendix A: Update count sources

2009 – Google’s Matt Cutts, video (Search Engine Land)
2010 – Google’s Eric Schmidt, testifying before Congress (Search Engine Land)
2012 – Google’s “How Search Works” page (Internet Archive)
2013 – Google’s Amit Singhal, Google+ (Search Engine Land)
2016 – Google’s “How Search Works” page (Internet Archive)
2017 – Unnamed Google employees (CNBC)
2018 – Google’s “How Search Works” page (Google.com)


The New Moz Local Is on Its Way!

Posted by on May 14, 2019 in SEO Articles | Comments Off on The New Moz Local Is on Its Way!

Posted by MiriamEllis

Exciting secrets can be so hard to keep. Finally, all of us at Moz have the green light to share with all of you a first glimpse of something we’ve been working on for months behind the scenes. Big inhale, big exhale…

Announcing: the new and improved Moz Local, to be rolled out beginning June 12!

Why is Moz updating the Moz Local platform?

Local search has evolved from caterpillar to butterfly in the seven years since we launched Moz Local. I think we’ve spent the time well, intensively studying both Google’s trajectory and the feedback of enterprise, marketing agency, and SMB customers.

Your generosity in telling us what you need as marketers has inspired us to action. Over the coming months, you’ll be seeing what Moz has learned reflected in a series of rollouts. Stage by stage, you’ll see that we’re planning to give our software the wings it needs to help you fully navigate the dynamic local search landscape and, in turn, grow your business.

We hope you’ll keep gathering together with us to watch Moz Local take full flight — changes will only become more robust as we move forward.

What can I expect from this upgrade?

Beginning June 12th, Moz Local customers will experience a fresh look and feel in the Moz Local interface, plus these added capabilities:

- New distribution partners to ensure your data is shared on the platforms that matter most in the evolving local search ecosystem
- Listing status and real-time updates to know the precise status of your location data
- Automated detection and permanent duplicate closure, taking the manual work out of the process and saving you significant time
- Integrations with Google and Facebook to gain deeper insights, reporting, and management for your location’s profiles
- An even better data clean-up process to ensure valid data is formatted properly for distribution
- A new activity feed to alert you to any changes to your location’s listings
- A suggestion engine to provide recommendations to increase accuracy, completeness, and consistency of your location data

Additional features available include:

- Managing reviews of your locations to keep your finger on the pulse of what customers are saying
- Social posting to engage with consumers and alert them to news, offers, and other updates
- Store locator and landing pages to share location data easily with both customers and search engines (available for Moz Local customers with 100 or more locations)

Remember, this is just the beginning. There’s more to come in 2019, and you can expect ongoing communications from us as further new feature sets emerge!

When is it happening?

We’ll be rolling out all the new changes beginning on June 12th. As with some large changes, this update will take a few days to complete, so some people will see the changes immediately while for others it may take up to a week. By June 21st, everyone should be able to explore the new Moz Local experience!

Don’t worry — we’ll have several more communications between now and then to help you prepare. Keep an eye out for our webinar and training materials to help ensure a smooth transition to the new Moz Local.

Are any metrics/scores changing?

Some of our reporting metrics will look different in the new Moz Local. We’ll be sharing more information on these metrics and how to use them soon, but for now, here’s a quick overview of changes you can expect:

- Profile Completeness: Listing Score will be replaced by the improved Profile Completeness metric. This new feature will give you a better measurement of how complete your data is, what’s missing from it, and clear prompts to fill in any lacking information.
- Improved listing status reporting: Partner Accuracy Score will be replaced by improved reporting on listing status with all of our partners, including continuous information about the data they’ve received from us. You’ll be able to access an overview of your distribution network, so that you can see which sites your business is listed on. Plus, you’ll be able to go straight to the live listing with a single click.
- Visibility Index: Though they have similar names, Visibility Score is being replaced by something slightly different with the new and improved Visibility Index, which notates how the data you’ve provided us about a location matches or mismatches your information on your live listings.
- New ways to measure and act on listing reach: Reach Score will be leaving us in favor of even more relevant measurement via the Visibility Index and Profile Completeness metrics. The new Moz Local will include more actionable information to ensure your listings are accurate and complete.

Other FAQs

You’ll likely have questions if you’re a current Moz Local customer or are considering becoming one. Please check out our resource center for further details, and feel free to leave us a question down in the comments — we’ll be on point to respond to any wonderings or concerns you might have!


Where is Moz heading with this?

As a veteran local SEO, I’m finding the developments taking place with our software particularly exciting because, like you, I see how local search and local search marketing have matured over the past decade.

I’ve closely watched the best minds in our industry moving toward a holistic vision of how authenticity, customer engagement, data, analysis, and other factors underpin local business success. And we’ve all witnessed Google’s increasingly sophisticated presentation of local business information evolve and grow. It’s been quite a ride!

At every level of local commerce, owners and marketers deserve tools that bring order out of what can seem like chaos. We believe you deserve software that yields strategy. As our CEO, Sarah Bird, recently said of Moz,

“We are big believers in the power of local SEO.”

So the secret is finally out, and you can see where Moz is heading with the local side of our product lineup. It’s our serious plan to devote everything we’ve got into putting the power of local SEO into your hands.


I/O announcements have some applauding and others shaking their fists

Posted by on May 13, 2019 in SEO Articles | Comments Off on I/O announcements have some applauding and others shaking their fists

Now that this year’s I/O conference is in the books, digital marketers have had a chance to digest Google’s big announcements. Chief among them were Googlebot getting pushed to the latest version of Chromium, Assistant delivering results up to 10 times faster and, perhaps most contentious of all, Search supporting FAQ and How-to structured data. As you can imagine, reactions weren’t limited to applause from the live audience.

Googlebot’s long-awaited update has engineers and developers nodding favorably.

Super cool. No more testing in Chrome webmaster tools to verify if your site is crawlable.

— Samar Panda (@samarpanda) May 8, 2019

Biggest unsung news 📰 of #io19…Googlebot now indexes the web using the latest Chromium rather than super old Chrome 42. Use modern features with confidence, without SEO issues. Huge! 🙌 pic.twitter.com/VJWjw71MyP

— Eric Bidelman (@ebidel) May 7, 2019

Still, some are keen to point out that this should have come sooner, especially because the update benefits businesses, consumers and Google itself.

it only took half a decade ++ !!

— jameschurchman (@jameschurchman) May 7, 2019

A number of Google Assistant-related announcements were made, but the speed demonstration is what might get users to take advantage of it more often and, by extension, businesses to prioritize integrating with it. Naturally, people drew comparisons with the competition.

Here’s an incredible demo from Google I/O. What they’re doing with the new Google Assistant is light-years ahead of Siri, which is a shame given the couple year head start Apple had. pic.twitter.com/eJiXv4SI7m

— Mike (@ekimgary) May 8, 2019

Siri: We now have better Maps integration
Bixby: We can now recognize multiple voices

Google Assistant: We’ll read your email, find the last time you booked a certain car, then book the same one in the same color for your next trip, which we also put in your calendar……..

— Marques Brownlee (@MKBHD) May 7, 2019

The announcement of support for How-to markup in search results received strong reactions. Some were excited to give it a test drive…

Actually pretty excited about this. Will definitely test out ASAP.

— Alexander Juul (@AlexanderJuul) May 8, 2019

…while others were anxious about what it could mean for the industry.

Hurray for more ways to get less traffic to your website and generate free content for @Google to run ads against !

— PaulsSEOstuff (@PaulsSEOstuff) May 8, 2019

Google is becoming a parasite. Not the mutually beneficial kind either, just a leech. You produce nothing, steal content, to make $$ & now even steal the click.

This can’t and won’t go on forever.

— Kristine Schachinger (@schachin) May 9, 2019

“Google started adding ‘features’ to the SERPS. Features whose content is not created by Google, but which operates off the scraped content of the sites in their index,” Schachinger, the digital strategist and SEO consultant quoted above, elaborated in a follow-up with Search Engine Land.

“These features ‘steal the click’ meant for the site because they are meant to keep people on Google’s page, so they will click on Google Ads. Despite a recent study showing users still, by majority, prefer the ten blue links the how-to feature shows these features are just becoming more (and not less) prevalent. The ten blue links now appear, on average, 1000px down the page, where previously they appeared between 300-400px.”

“In ‘stealing the click,’ Google is only benefiting its bottom line. And for those whose content they are using to do this, it fundamentally alters the previously beneficial relationship between Google and site owners,” she points out. “What happens to their business when site owners start putting their money and efforts elsewhere? And this is not just supposition, I can tell you I know of some enterprise level C Suites that are testing just this, right now, because of the perception that Google is becoming less and less beneficial.”

Adding to the assortment of reactions, some see structured data (such as How-to markup) as an opportunity to gain more visibility by leapfrogging the top organic search results. Others, like Greg Finn, digital marketer and partner at Cypress North, acknowledge that the change does convenience users.

“On one hand, users should benefit in the immediate future by having Google surface every bit of helpful content on a site and showing it directly in the search results. Better yet, webmasters that participate may see a boost as they put themselves into the position of offering better content for Google.”

“The other hand is the scarier one,” Finn admits. “One way to look at it is that they are cutting out the middleman, with the middleman being the website itself. Many of the examples shown simply won’t drive traffic. Take a look at the FAQs and the ‘How To Tie a Tie’ example specifically. There is a monumental downside to Google Search changes that bypass your site & your work, so be careful. Make sure you know who is benefiting on your markup. When websites lose visitors & income, the overall content and output inevitably become worse. That’s my fear here.”
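
For reference, the FAQ markup being debated is ordinary schema.org structured data. A rough, hand-rolled sketch of what emitting it from a WordPress site could look like – the question and answer text are placeholders, and plugins typically generate this for you:

add_action( 'wp_head', function () {
    if ( ! is_singular() ) {
        return;
    }
    // Placeholder FAQ content; a real implementation builds this from page data.
    $faq = array(
        '@context'   => 'https://schema.org',
        '@type'      => 'FAQPage',
        'mainEntity' => array(
            array(
                '@type'          => 'Question',
                'name'           => 'Placeholder question?',
                'acceptedAnswer' => array(
                    '@type' => 'Answer',
                    'text'  => 'Placeholder answer.',
                ),
            ),
        ),
    );
    echo '<script type="application/ld+json">' . wp_json_encode( $faq ) . '</script>';
} );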

Why we should care. Google has been introducing numerous products and features that insert Google between businesses and users, under the guise of getting users the info they want faster. The problem is that businesses aren’t necessarily seeing the benefits, but Google still stands to gain.

Clicks are becoming more scarce, and that’s an indicator that potential customers are getting less contact with our brands. By investing resources and embracing these new features and markups, are we facilitating search engines at our own expense? If that’s the case, at some point brands are bound to get fed up and seek alternative routes to their audiences, or the search engines will have to offer us more for our efforts and ad budgets.


The Future of Display Advertising

Posted by on May 13, 2019 in SEO Articles | Comments Off on The Future of Display Advertising


Display is a key tool in the digital marketing playbook. But the landscape is rapidly changing, as emerging adtech formats – including in-banner video, dynamic creative and mobile optimization – help marketers achieve greater efficiencies and improved display results.

Are you ready to leverage these new opportunities?

Join our display advertising experts as they discuss new display best practices that can lift both brand awareness and bottom-line conversions. You’ll hear how you can effectively adopt emerging technologies to create more personalized, relevant display ad campaigns.

Register today for “The Future of Display Advertising: New marketing strategies to boost results,” produced by Digital Marketing Depot and sponsored by Bannerflow.
