Blog

Google Ads issue with access to video pages frontend

Posted on May 19, 2022 in SEO Articles

Google is currently having access issues with video pages in Google Ads. If you try to access video pages in Google Ads right now, you will likely be presented with a red banner containing an error message.

Notice. Google posted about this issue, saying:

We’re aware of a problem with Google Ads affecting a majority of users. We will provide an update by May 19, 2022, 6:00 PM UTC detailing when we expect to resolve the problem. Please note that this resolution time is an estimate and may change. The affected users are able to access Google Ads, but are seeing error messages, high latency, and/or other unexpected behavior. Users trying to access Video pages in Google Ads Frontend will see an error message with the red banner. No workarounds are available at this time.

Fix coming. Google is clearly aware of the issue and working on a fix, but there is currently no estimated time for when this will be resolved.

If you notice this issue, try again in a few hours and move on to other areas of your Google Ads accounts in the meantime.

Why we care. This is just a heads up that if you are noticing this error, you are not alone. Google has confirmed the issue and is working on a fix. We will update this story when the issue is fully resolved.

The post Google Ads issue with access to video pages frontend appeared first on Search Engine Land.

Three critical keyword research trends you must embrace

Posted on May 19, 2022 in SEO Articles

30-second summary:

Exact-match keywords are useful for researching patterns and trends but not so much for optimization purposes
When optimizing for keywords, optimize for intent and solve problems, don’t just match your page to the keyword
Brand-driven keywords should be your top priority because you cannot control SERPs but you can rank assets that will drive people back to your site
Instead of focusing on keyword strings, research your niche’s entities and find ways to associate your business with them through on-site content and PR/link building efforts

If you ask an SEO expert to name one SEO tactic that has changed the most over the years, they are likely to confidently answer “link building.” Some will point to “technical tasks”, and very few will ever think of “keyword research.”

The truth is, most SEO tasks look completely different these days, but few SEO experts have changed the fundamental way they do keyword research and optimize content for those keywords.

Yes, we seem to have finally left keyword density behind (unless Google forces it back) but fundamentally nothing has changed: We run keyword tools, find relevant keyword strings and use them as much as we can throughout a dedicated page.

In the meantime, Google’s understanding and treatment of keywords has changed completely.

1. Exact-match keywords are becoming obsolete

Google has a long history of trying to understand search queries beyond matching word strings in them to the documents in the search index.

And they succeeded.

It started years ago with Hummingbird being first quietly introduced then officially announced in August of 2013.

Yet, few SEOs actually understood the update or realized how much of a change it was to everything they knew.

With Hummingbird, Google made it clear that it was striving for a deeper understanding of search journeys, and that this would ultimately fix all of its problems: once Google knows exactly what a searcher wants and learns to give them that, no fake signals or algorithm manipulations will impact its search quality.

Hummingbird was the first time Google announced they wanted to understand “things” instead of matching “strings of words.” In other words, with Hummingbird exact-match keyword strings started becoming less and less useful.

Then, after Hummingbird, came BERT, which helped Google enhance its understanding of how people search.

Image source: Google

There’s a short but pretty enlightening video on the struggles and solutions of Google engineers trying to teach the machine to understand the obvious: What is it people mean when typing a search query?

That video explains the evolution of SEO perfectly:

Context is what matters
Google is struggling, yet slowly succeeding at understanding “context, tone and intention”
Search queries are becoming less predictable as more and more people talk to a search engine the way they think
Stop words do actually add meaning, and are often crucial in changing it.

The takeaway here: Keyword research tools are still useful. They help you understand the patterns: How people tend to phrase a query when looking for answers and solutions in your niche.

But those keywords with search volume are not always what people use to research your target topic. People search in diverse, often unpredictable ways: according to Google, 15% of the search queries it sees each day are completely new, ones it has never encountered before. That’s how diverse searching behaviors are.

Moving away from keyword matching, Google strives to give complete and actionable answers to the query. And that’s what your SEO strategy should aim to do as well.

Whatever keyword research process you’ve been using is likely still valid: It helps you understand the demand for certain queries, prioritize your content assets and structure your site.

It’s the optimization step that is completely different these days. It is no longer enough to use that word in the page title, description and headings.

So when creating an optimization strategy for every keyword you identify:

Try to figure out what would satisfy the search intent behind that query: What is that searcher really looking for? A list? A video? A product to buy? A guide to follow? Even slight changes in a searchable keyword string (e.g. plural vs singular) can signal a search intent you need to be aware of (a rough heuristic sketch follows after the screenshot below).
Search Google for that query and look through search snippets: Google is very good at identifying what a searcher needs, so they generate search snippets that can give you lots of clues.

Notice how none of the high-ranking documents has that exact search query included:

Image source: Screenshot made by the author
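To make that intent-detection step more concrete, here is a deliberately naive Python sketch that buckets keywords by common intent modifiers. The modifier sets and labels are illustrative assumptions, not an official taxonomy; real intent analysis should also lean on what the live SERP actually shows.

```python
# Naive intent bucketing for keyword lists: a rough heuristic sketch.
# The modifier sets below are illustrative assumptions, not a standard taxonomy.

TRANSACTIONAL = {"buy", "price", "cheap", "discount", "coupon", "order"}
COMMERCIAL = {"best", "top", "review", "reviews", "vs", "compare"}
INFORMATIONAL = {"how", "what", "why", "guide", "tutorial", "tips"}

def guess_intent(keyword: str) -> str:
    """Return a rough intent label for a single keyword string."""
    words = set(keyword.lower().split())
    if words & TRANSACTIONAL:
        return "transactional"
    if words & COMMERCIAL:
        return "commercial investigation"
    if words & INFORMATIONAL:
        return "informational"
    # No modifier matched: often navigational or ambiguous -- check the live SERP.
    return "unknown (inspect the SERP)"

for kw in ("buy running shoes", "best running shoes 2022", "how to lace running shoes"):
    print(f"{kw!r}: {guess_intent(kw)}")
```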

2. Branded keywords are your priority

More and more people are using search to navigate to a website, and there are several reasons for that:

The most popular browsers allow people to search straight from the address bar (those include Safari on both desktop and mobile and, obviously, Google Chrome)
People are getting used to voice search, so they just speak brand names to perform a search.

Image source: Screenshot made by the author

In other words, your customers who likely know about your brand and are possibly ready to make a purchase – those hard-earned customers – are forced to search for your brand name or for a branded query.

And what will they see?

It is astounding how many companies have no idea what comes up for their branded search, or how many customers they lose over poorly managed (or more often non-existent) in-SERP reputation management.

There are three crucial things to know about brand-driven search:

These are mostly high-intent queries: These searchers are typing your brand name intending to buy from you
These are often your existing, returning customers, who tend to buy more than first-time customers
Both of the above factors make these queries your brand’s top priority.

And yet, you don’t have control over what people see when searching for your brand. In fact, monitoring and optimizing for those brand-driven queries is not a one-time task. It is there for as long as your brand exists.

Treat your brand name as a keyword: Expand it, optimize for it, monitor your site’s rankings
Identify deeper level problems behind your customers’ brand-driven searching patterns: What is it you can improve to solve problems behind those queries?

Image source: Screenshot made by the author

Your branded search queries should become part of your sales funnel – everything from your About page to product pages and lead magnets should capture those brand-driven opportunities.

In many cases, when you see a large number of brand-driven keywords, you may need a higher-level approach, like setting up a standalone knowledge base.

3. Entities are key

Entities are Google’s way to understand this world.

Entities are all proper names out there: Places, people, brands, etc.

Google has a map of entities – called Knowledge Graph – that makes up Google’s understanding of the world.

Entities help Google understand the context and the search intent.

Image source: The beginner’s guide to semantic search

Being one of Google’s known entities means coming up in searches where you were implied but never mentioned:

Image source: Screenshot made by the author

Through entity associations, Google knows what any search is about.

Entities should be at the core of your keyword research process: What are the known entities in your niche, and how do you associate your brand with those entities?
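One practical way to see which entities Google already recognizes around your niche is its Knowledge Graph Search API. Below is a minimal Python sketch, assuming you have created an API key in the Google Cloud console; “YOUR_API_KEY” and the example query are placeholders.

```python
# Minimal sketch: look up entities in Google's Knowledge Graph Search API.
# "YOUR_API_KEY" is a placeholder; create a real key in the Google Cloud console.
import json
import urllib.parse
import urllib.request

def kg_lookup(query: str, api_key: str, limit: int = 3) -> list:
    params = urllib.parse.urlencode({"query": query, "key": api_key, "limit": limit})
    url = f"https://kgsearch.googleapis.com/v1/entities:search?{params}"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    # Each element carries the entity's name, its schema.org types and a relevance score.
    return [
        (el["result"].get("name"), el["result"].get("@type"), el.get("resultScore"))
        for el in data.get("itemListElement", [])
    ]

print(kg_lookup("semantic search", "YOUR_API_KEY"))
```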

Conclusion

Search engine optimization is evolving fast, so brands need an agile strategy to keep up. If you are doing keyword research the old, exact-match way, your business is about 10 years behind!

Ann Smarty is the Founder of Viral Content Bee, and Brand and Community Manager at Internet Marketing Ninjas. She can be found on Twitter @seosmarty.

The post Three critical keyword research trends you must embrace appeared first on Search Engine Watch.

Everything you should know about evaluating your competitor’s backlink profile

Posted on May 19, 2022 in SEO Articles

Competitive backlink research is one of the first steps in either building your own link-building strategy or figuring out what it takes to achieve your competitors’ organic rankings.

Links are certainly not the only ranking signal, but they are still one of the most powerful factors (if not the most powerful one).

When selecting your competitors to analyze you will likely choose those that rank particularly well for your target queries, which makes sense because you want to know what has worked for them.

There’s one important thing to keep in mind here: It’s generally best to select your peers (sites directly in your vertical or niche). In other words, stay away from large websites that play within a variety of verticals but happen to rank above you (big box stores, Wikipedia, etc.).

There’s not much you can learn from Amazon’s backlink profile, for example, apart from the fact that being a web giant is working well for them.

Likewise, there’s less to learn from your oldest competitors, apart from the fact that starting early (and earning all those age and trust signals over time) is certainly a good idea.

Instead, look for sites that have seen a recent growth in rankings to zero in on tactics working well for them. These are the types of sites you can best learn from, and this is what will make your competitive research actionable, i.e. help you build and implement your own strategy.

Once you have 2-4 competitors to analyze, first rule out all the red flags you want to avoid. In other words, start with what you don’t want to do. Filter those lower-quality and often risky links out so you can find the best links your industry peers have in common.

Step 1: Filter out red flags

When it comes to link building, too much of any questionable tactic can be detrimental, but let’s get a bit more specific. Look for the following red flags:

Exact match anchor text

Are you seeing a lot of backlinks that repeat the same (or almost the same) anchor text over and over again? This is always a sign of poor and outdated link building that may get (or may have gotten) your competitors into trouble.
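To spot this pattern at scale, you can run a quick frequency check over a backlink export. The sketch below assumes a CSV with an “anchor” column (most backlink tools can export something similar); the column name, file name and the 25% threshold are assumptions to adjust for your own toolset.

```python
# Flag suspiciously repetitive anchor text in a backlink export.
# Assumes a CSV with an "anchor" column; names and threshold are placeholders.
import csv
from collections import Counter

def repeated_anchors(path: str, threshold: float = 0.25) -> list:
    with open(path, newline="", encoding="utf-8") as f:
        anchors = [row["anchor"].strip().lower() for row in csv.DictReader(f)]
    counts = Counter(anchors)
    total = sum(counts.values())
    # Any single anchor covering more than `threshold` of all backlinks is a red flag.
    return [(a, n, n / total) for a, n in counts.most_common() if n / total > threshold]

for anchor, n, share in repeated_anchors("competitor_backlinks.csv"):
    print(f"{anchor!r}: {n} links ({share:.0%} of profile)")
```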

Very often, when you see a backlink profile that is full of obviously SEO-driven links, you will also notice that the site lost visibility at some point: look for dips in organic visibility over the years using Semrush or your favorite SEO toolset.

You cannot see their disavow file, but chances are they have gradually revived their rankings by urging Google to discount those low-quality links. And if those links haven’t cost them rankings yet, chances are they eventually will.

It doesn’t have to be a manual penalty, though: Google may already be discounting those links, so they have zero impact on organic visibility at this point.

In either case, these are not the types of links you’d want to be after.

Outdated link building tactics

There are still quite a few outdated link-building patterns found in lots of backlink profiles out there.

These include:

Directory links
Links from blog networks
Article directory links
Content syndication (press releases or other)
Low-quality guest posting links

Look out for links from websites that invite anyone to submit an article or “sponsor content” on them. Keep an eye out for thin content that was obviously created for the sake of linking to your competitor. More importantly, try to spot obvious patterns behind those backlinks: The same link building tactic appearing over and over throughout a backlink profile.

These links are probably also discounted by Google; none of these link types are worth your effort or investment.

Step 2: Learn from their success

Now that you know what to stay away from, focus on what you can learn from your competitor’s backlink profile.

If you chose your competitors wisely based on organic visibility, there will likely be more to learn than to avoid. After all, if those sites rank well, Google obviously likes their backlink profiles, or at least they are doing something right.

So, what can you learn from your competitors’ backlink profiles?

1. Your competitors’ content marketing tactics

Which content seems to work for your competitors in terms of link generation? What’s their most-linked content? Have they managed to get any of their content assets to go viral or picked up by notable web publications? Can yours do better?

Obviously, you don’t know what happens behind the scenes of them achieving those links, but it is usually obvious when a particular content asset did extraordinarily well for generating solid backlinks.

It is usually easy to identify content that went viral and generated hundreds of links or a resource page that got cited by highly trusted websites like universities and government organizations.

Can you recreate those types of assets for your website and bring them up to date or make them better?

It is also a good idea to identify your competitor’s high-ranking content. Content that ranks on top of Google tends to bring in links naturally as bloggers and journalists use Google to find sources. Getting your articles to rank is also a link acquisition tactic bringing organic link equity on a continuous basis without you having to actively build those links through traditional outreach.

Find your competitors’ articles that rank high for searchable keywords.
Check the backlinks of those articles to identify whether the tactic works for them.
Try to claim those rankings by creating much better content.

Keep an eye on higher-level tactics that bring your competitors rankings and links. What type of content is delivering topical links? Oftentimes these would be:

Glossaries and knowledge bases
In-depth how-to content
Statistical studies and survey results (these tend to be the most powerful), etc.

2. Your competitors’ outreach tactics

Who are your competitors reaching out to when trying to build links?

It is usually easy to tell by the type of links they are getting:

Links from news outlets come as a result of journalistic outreach
Trusted links from educators (college professors, teachers, etc.) require targeted trust-bait content and outreach
Links from blogs are built through blogger outreach (and often the creation of viral assets, like free tools and infographics)

Which of those links seem to dominate your competitors’ backlink profile? Knowing the answer will inspire your own link acquisition strategy and help you make more informed decisions.

3. Your competitors’ influencer marketing tactics

Who are your competitors’ content amplifiers? In other words, who are the people (authors, niche experts, etc.) behind the links your competitors are getting?

Influencer marketing is a great way to generate backlinks on many levels:

Lots of niche influencers have sites and blogs they can use to link from
Influencers (if you choose them wisely) can drive organic links by simply sharing your content or mentioning it in their newsletters
You may be able to actively engage with influencers within your niche via interviews, podcasts, Q&As, etc.

Sometimes, influencer-based tactics are hard to track in your competitors’ backlink profiles: it is hard to attribute a sudden surge of backlinks to your competitor’s site without knowing the root cause of the spike.

This is where well-organized social media research and listening can help your competitive backlink analysis. Search Twitter and Instagram for your competitors’ brand names to see who is talking about them and what kind of audience is listening to those messages. Tools like Keyhole (a social media analytics platform) and Milled (a newsletter archive) can help you identify those sources of influence and match them to your competitor’s backlink profile.

Conclusion

Competitive backlink research is often enlightening if you know what to look for.

There is no use in trying to go after each and every one of their good links, though. Instead, take a higher-level approach: What is it they are doing to generate links, and how can I do the same but better?

Trying to be merely as good as your competitor means there’s no reason for Google to rank your site higher. You need to always strive to do better: Better content, better outreach, better promotion tools. There’s often a lot of “heavy lifting” internally to get this right, and many companies choose to hire a dedicated link-building company to do it right. Whichever direction you go, staying on top of your competitors’ backlinks (and your own!) will help you earn and maintain top rankings as time goes on.

The post Everything you should know about evaluating your competitor’s backlink profile appeared first on Search Engine Land.

Lucid visibility: How a publisher broke into Google Discover in less than 30 days from launch

Posted on May 18, 2022 in SEO Articles

Google Discover is one of the most sought-after traffic sources by publishers, while also being one of the most confusing from a visibility standpoint. For many, it’s an enigma. For example, some publishers I help receive millions of clicks per month, while others receive absolutely none. And a ton of traffic can turn into no traffic in a flash, just like when a broad core update rolls out. More on that soon.

Google has explained that it’s looking for sites that “contain many pages that demonstrate expertise, authoritativeness, and trustworthiness (E-A-T)” when covering which content ranks in Discover. Strong E-A-T can take a long time to build up, or so you would think. For example, when a new site launches and has no history, no links, etc., it often can take a long time to cross the threshold where it can appear in Discover (and consistently).

That’s why the case study I’m covering in this post is extremely interesting. I’ll cover a brand-new site, run by a well-known person in the tech news space, and the site broke into Discover in a blistering four weeks from its launch. It is by far the quickest I have seen a new site start ranking in Discover.

And if you’re in the SEO industry, I’m sure you’ll get a kick out of who runs the site. It’s none other than Barry Schwartz, the driving force behind a lot of the news we consume in the SEO industry. But his new site has nothing to do with SEO, Search Engine Roundtable, or Search Engine Land.

Or does it? I’ll cover more about that in the article below. Let’s jump in.

Barry’s unrelenting blogging process:

As many of you know, Barry’s work ethic is insanely strong. When he decides to do something, look out. So as you can guess, Barry took his Search Engine Roundtable blogging process and employed it for Lucid Insider, his blog dedicated to news about Lucid Motors, the car manufacturer producing the luxury electric sedan Lucid Air. He blogs every day, with multiple posts covering what’s going on in the Lucid world.

Over the past several days, I went through most of his posts, about 150 so far, and I feel completely up-to-speed on Lucid. It’s sort of the way the SEO industry feels when reading Search Engine Roundtable. I bring this up just so you have an understanding of content production on Lucid Insider.

There are now 204 URLs indexed on the site, and the first article was published on March 15, 2022. I’ll come back to dates soon.

Time To Discover Visibility (TTDV)

Haven’t heard of TTDV yet? That’s because I just made it up. Lucid Insider started surfacing in Discover in just four weeks from the time the site launched. For the most part, that’s before any (consistent) strong signals could be built from an E-A-T standpoint, before earning many links, before publishing a ton of content on the topic, etc.

And since the first article broke into Discover, Lucid Insider has consistently appeared in the feed (both as articles and Web Stories). As of a few days ago, Discover has driven 8,351 clicks to the site out of 9,726 total clicks from Google. Traffic from Google Search is building but has only accounted for 14% of total clicks so far. Discover is now 86% of total clicks from Google.

Articles and Web Stories with a nudge from a friend:

After Barry launched Lucid Insider, I pinged him and said he should build some Web Stories, especially as Discover showed signs of life for Lucid Insider. I have covered Web Stories heavily since they launched and have built several of my own stories covering various SEO topics. I have seen first-hand what they can do visibility-wise in Search and in Discover.

Specifically for Discover, Google has a Web Stories carousel, which prominently displays a number of stories in a special feed treatment. So, I thought Barry could create some stories to possibly rank there. And that definitely worked to an extent. Both articles and Web Stories have ranked in Discover for Lucid Insider, although most of the Web Story traffic came from just one story. That story had a strong click through rate of nearly 8%, but it’s really the only one that drove any substantial traffic.

Here is an example of that top Web Story from Lucid Insider appearing in Discover’s story carousel. The feed treatment is killer and can drive a lot of impressions and clicks.

Google News: No Visibility At All… Until This Week!

Since Discover is often tied closely to news publishers, you would think Lucid Insider would have also appeared in Google News – but that wasn’t the case until this week! Google News reporting didn’t even show up in Search Console until Monday. Sure, it’s not a lot of visibility yet, but this is a brand-new website. So Google is now expanding Lucid Insider’s visibility to Google News, and in just two months. Definitely a good sign for Barry.

It’s also worth noting that Barry doesn’t even have a Google Publisher Center account set up. That doesn’t impact visibility in Google News, but most publishers set one up, since you can control several aspects of your publisher account, including site sections, logo, etc.

Search Is Growing:

Before I dig into the possibilities of why and how Lucid Insider broke into Discover so quickly, I wanted to touch on Search. Although Search is driving much lower traffic levels as a percentage of the total, it is growing over time and will probably continue to do so.

Barry’s blog is starting to rank for a number of queries, and even has some featured snippets already. Just like with Search Engine Roundtable, I’m confident that Lucid Insider will do well in Search. Google’s algorithms just need more time in my opinion, which is at odds with Discover’s algorithms at this point.

That’s a good segue to the next part of the post where I’ll cover the possible reasons why, and how, Lucid Insider is ranking so quickly in Discover. Join me as I travel down the rabbit hole…

E-A-T:

I won’t go in-depth about E-A-T overall, since there are many other posts you can read about that. But it’s important to understand that Google explains in its Discover documentation that it looks for sites exhibiting high levels of E-A-T when determining what should appear in the feed.

It’s also important to understand that Google has explained E-A-T is heavily influenced by links and mentions from authoritative sites.

I asked Gary about E-A-T. He said it's largely based on links and mentions on authoritative sites. i.e. if the Washington post mentions you, that's good.

He recommended reading the sections in the QRG on E-A-T as it outlines things well.@methode #Pubcon

— Dr. Marie Haynes (@Marie_Haynes) February 21, 2018

Therefore, I immediately jumped into the link profile of Lucid Insider. Remember, it has only been around since March 15, 2022. When digging into the Links report in GSC, there were only 92 links there. And most of them were from SEO-related sites and content, and not automotive content.

That’s because Barry had mentioned his new blog on Search Engine Roundtable and Search Engine Land, and those sites get copied and scraped all the time, so those links end up on many other sites focused on Search. I’m not saying those other sites are providing a ton of power or value link-wise, but it’s worth noting.

From an E-A-T standpoint, both Search Engine Roundtable and Search Engine Land are authoritative sites, but they don’t focus on Lucid Motors, electric cars, automotive news in general, etc. So it’s weird to think Google would provide a ton of value from those links over the long-term, since they aren’t topically relevant at all.

I bolded “over the long-term” because I have a feeling Google’s algorithms are still figuring things out for Lucid Insider. Longer-term, I’m not sure SEO-related links will help Lucid Insider as much, as Google’s algorithms determine more about the site, its content, focus, etc.

There are definitely a few links from Lucid forums to Lucid Insider, but not many yet… And check out Majestic’s topical trust flow, just to understand the topics Lucid Insider is associated with from an inbound links standpoint. The topics reflect an SEO blog and not a blog covering the Lucid Air, at least for now.

But It’s BARRY! The Man, The Brick, The Legend

For journalists, Google can connect the dots and understand articles published across sites. Google has explained this before in a blog post about journalists and you can see it firsthand in the SERPs. For example, Google can provide an Articles carousel in the SERPs for certain journalists.

So, is Barry being the author causing Lucid Insider to break into Discover faster than another author would? Could Google really be doing that??

I’m pretty sure that’s not the case. I’ll explain more below.

Barry has an Article carousel in the SERPs, but it only contains Search Engine Land and Search Engine Roundtable articles. Lucid Insider isn’t there. The carousel shows up when you surface Barry’s Knowledge Panel by searching for “Barry Schwartz technologist”.

In addition, Barry isn’t even using Article structured data (or any structured data) to feed Google important information about the article content, including author information. He could be helping Google connect the dots by providing author information, including author.url, which is a property of Article structured data. That is what Google recommends providing, especially for journalists who write for various news publishers.
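For illustration, here is roughly what that markup could look like. This is a hypothetical sketch: the headline, date and URLs below are placeholders, not actual Lucid Insider markup (remember, the site has none).

```html
<!-- Hypothetical Article structured data for a Lucid Insider post.
     Headline, date and URLs are placeholders, not the site's real markup. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Lucid Air news post",
  "datePublished": "2022-05-18",
  "author": {
    "@type": "Person",
    "name": "Barry Schwartz",
    "url": "https://www.example.com/author/barry-schwartz/"
  }
}
</script>
```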

And when checking Barry’s Wikipedia page and Knowledge Panel, there is no mention of Lucid Insider. Barry should probably hop on that, but it’s not there as of now.

And last, even a related: query for Lucid Insider doesn’t yield any results (that operator shows sites related to the domain you enter). Again, this will change over time as Google understands more about the site, the content, the focus, etc.

So, I doubt Google is making the connection there based on Barry being the author. Let’s move on.

Fresh Topic, Rabid Fans, Big Opportunity:

Lucid Motors is a new electric car manufacturer, so there is clearly not as much written or covered about it as there is about Tesla or other car manufacturers. That could definitely be an important factor in Lucid Insider’s visibility in Discover. For example, there’s less content to choose from when Google selects content to show in a user’s Discover feed, when that person shows an interest in Lucid.

To give some context, here is Google Trends data for interest in Lucid Motors compared to Tesla, Inc.

And here is Google Trends data showing interest in the various models (the Lucid Air versus the Tesla Model 3 or Model S):

In addition, there are some serious Lucid fans out there, eager to check out news and information from various Lucid sources. And since Discover is based on a person’s interests and activities, their travels across the Lucidsphere could be leading Discover’s algorithms to surface more Lucid information in their feeds – maybe the algorithm is hungry for that information. And again, there isn’t as much content about Lucid as there is about other topics, at least yet.

Here is the top Lucid forum showing some of the activity there. And check out the sidebar… there’s a familiar face there.

It will be interesting to see how Lucid Insider performs in Discover as more sources of information hit the web. I saw this first-hand with Web Stories in Discover. When the carousel first launched, I was early to publish a Web Story. And it received 304K impressions in Discover. That wasn’t the case when I published subsequent stories, as more and more publishers started creating Web Stories. Anyway, we’ll see how it goes as more sites cover Lucid Air.

No Structured Data, No Open Graph Tags…

I mentioned earlier that Barry isn’t using any structured data at all for his articles. Well, he’s not using Open Graph tags either. Note, Open Graph tags are not a ranking factor, but they do provide more information about each article, including which images to use when shared. And Google Discover can use that larger image provided by Open Graph tags.

When checking the various social debugging tools, you can see that Facebook is dynamically populating Open Graph tags via other tags it finds on the page, the Twitter card validator fails, etc.
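For reference, a minimal set of Open Graph tags looks like the snippet below; every value here is a placeholder. The og:image tag is the one that supplies the larger image social platforms (and, as noted above, Discover) can pick up.

```html
<!-- Minimal Open Graph tags; every value here is a placeholder. -->
<meta property="og:type" content="article" />
<meta property="og:title" content="Example Lucid Air news post" />
<meta property="og:description" content="One-sentence summary of the post." />
<meta property="og:url" content="https://www.example.com/lucid-air-news/" />
<meta property="og:image" content="https://www.example.com/images/lucid-air-1200x630.jpg" />
```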

I’m just providing this information to show that Barry has a relatively basic setup from a technical standpoint. And he’s ranking in Discover despite not providing all the bells and whistles.

Side Note: Desktop-first Indexing. What?

Google has explained in its developer documentation that mobile-first indexing would be enabled for all new sites starting in July 2019. Well, for Lucid Insider, desktop-first indexing is enabled. Just an interesting side note I discovered (pun intended) while digging into the data and reporting.

That said, the crawl stats show mostly smartphone crawling, so maybe Google Search Console lags in displaying the correct indexing model for Lucid Insider. Again, just an interesting find.

Broad core updates and Discover impact:

I mentioned earlier that Discover visibility can be impacted by broad core updates. I have covered that heavily in my blog posts about broad core updates, in presentations covering the topic, and on Twitter. Google’s John Mueller has explained this as well and it’s in the official blog post about broad core updates.

The reason I mention this is because Lucid Insider was launched in between broad core updates (and a fresh one hasn’t launched yet). So, could Lucid Insider be seeing a surge in Discover because Google doesn’t have enough signals yet to accurately measure site quality and relevance and a broad core update hasn’t been released? As Google’s quality algorithms are refreshed with the next broad core update, will Lucid Insider disappear from Discover?

It’s totally possible. Even Barry understands that what he’s experiencing is super-fast for breaking into Discover and now Google News. Four weeks is nothing when some other sites are still not in Discover, years down the line. We’ll be watching closely as the next broad core update rolls out which will hopefully be soon. We are due, that’s for sure.

Also, John Mueller has explained in the past that some newer sites might rank very well in Search in the short-term until Google’s algorithms can pick up more signals about the site, content quality, relevance, etc. And once it does, then the site could drop, or even surge. So just like I explained above, Lucid Insider can potentially see volatility as Google picks up more signals, understands its place in the web ecosystem, etc.

Here is my tweet about that segment regarding Search. You can check out the video I linked to from the tweet where Google’s John Mueller explains more about this:

That's pretty normal for a site that's trusted by Google. I've seen pages get indexed that way within minutes of publishing and rank well immediately. Google has explained they need to estimate where it should rank to start. That can change as it learns: https://t.co/xteCwTik3L

— Glenn Gabe (@glenngabe) November 2, 2018

Summary: Will Lucid Insider’s Discover Visibility Remain Strong?

So there you have it. Lucid Insider broke into Discover extremely quickly and is driving a majority of the traffic to the site now. Search is increasing, but Discover is 86% of Google traffic as of now. Although I covered a number of areas that could be helping Barry appear and thrive in Discover, this may be short-lived, at least until the site can publish much more content, earn stronger links and mentions from automotive sites, blogs, forums, etc., and build stronger E-A-T overall.

And in true Barry-form, he will be freely sharing how the site is performing over time across Google surfaces. So stay tuned for updates on how Lucid Insider is performing in Search, Discover, and Google News. And on that note, we’re expecting a broad core update soon, so it will be extremely interesting to see how that impacts the Discover visibility for Lucid Insider.

The post Lucid visibility: How a publisher broke into Google Discover in less than 30 days from launch appeared first on Search Engine Land.

Webinar: Transform your content operations with DAM

Posted on May 18, 2022 in SEO Articles

When it comes to promoting and selling products, content is the beginning of everything. The demand for content management is greater than ever as customers receive information across an ever-increasing number of channels.

Disorganized content workflows can be a recipe for disaster. So it’s imperative that product assets are organized, controlled, and accessible to a range of internal and external stakeholders. Join experts from McCormick & Company and Acquia as they discuss the challenges, opportunities, and lessons learned through McCormick’s DAM journey.

Register today for “Content Comes First: Transform Your Operations With DAM,” presented by Acquia.

The post Webinar: Transform your content operations with DAM appeared first on Search Engine Land.

We’ve crawled the web for 32 years: What’s changed?

Posted on May 18, 2022 in SEO Articles

It was 20 years ago this year that I authored a book called “Search Engine Marketing: The Essential Best Practice Guide.” It is generally regarded as the first comprehensive guide to SEO and the underlying science of information retrieval (IR).

I thought it would be useful to look at what I wrote back in 2002 to see how it stacks up today. We’ll start with the fundamental aspects of what’s involved with crawling the web.

It’s important to understand the history and background of the internet and search to understand where we are today and what’s next. And let me tell you, there is a lot of ground to cover.

Our industry is now hurtling into another new iteration of the internet. We’ll start by reviewing the groundwork I covered in 2002. Then we’ll explore the present, with an eye toward the future of SEO, looking at a few important examples (e.g., structured data, cloud computing, IoT, edge computing, 5G).

All of this is a mega leap from where the internet all began.

Join me, won’t you, as we meander down search engine optimization memory lane.

An important history lesson

We use the terms world wide web and internet interchangeably. However, they are not the same thing. 

You’d be surprised how many don’t understand the difference. 

The first iteration of the internet was invented in 1966. A further iteration that brought it closer to what we know now was invented in 1973 by scientist Vint Cerf (currently chief internet evangelist for Google).

The world wide web was invented by British scientist Tim Berners-Lee (now Sir) in the late 1980s.

Interestingly, most people have the notion that he spent something equivalent to a lifetime of scientific research and experimentation before his invention was launched. But that’s not the case at all. Berners-Lee invented the world wide web during his lunch hour one day in 1989 while enjoying a ham sandwich in the staff café at the CERN Laboratory in Switzerland.

And to add a little clarity to the headline of this article, from the following year (1990) the web has been crawled one way or another by one bot or another to this present day (hence 32 years of crawling the web).

Why you need to know all of this

The web was never meant to do what we’ve now come to expect from it (and those expectations are constantly becoming greater).

Berners-Lee originally conceived and developed the web to meet the demand for automated information-sharing between scientists in universities and institutes around the world.

So, a lot of what we’re trying to make the web do is alien to the inventor and the browser (which Berners-Lee also invented).

And this is very relevant to the major scalability challenges search engines face in trying to harvest content to index and keep fresh, while at the same time trying to discover and index new content.

Search engines can’t access the entire web

Clearly, the world wide web came with inherent challenges. And that brings me to another hugely important fact to highlight.

It’s the “pervasive myth” that began when Google first launched and seems to be as pervasive now as it was back then. And that’s the belief people have that Google has access to the entire web.

Nope. Not true. In fact, nowhere near it.

When Google first started crawling the web in 1998, its index was around 25 million unique URLs. Ten years later, in 2008, it announced it had hit the major milestone of having had sight of 1 trillion unique URLs on the web.

More recently, I’ve seen numbers suggesting Google is aware of some 50 trillion URLs. But here’s the big difference we SEOs all need to know:

Being aware of some 50 trillion URLs does not mean they are all crawled and indexed.

And 50 trillion is a whole lot of URLs. But this is only a tiny fraction of the entire web.

Google (or any other search engine) can crawl an enormous amount of content on the surface of the web. But there’s also a huge amount of content on the “deep web” that crawlers simply can’t get access to. It’s locked behind interfaces leading to colossal amounts of database content. As I highlighted in 2002, crawlers don’t come equipped with a monitor and keyboard!

Also, the 50 trillion unique URLs figure is arbitrary. I have no idea what the real figure is at Google right now (and they have no idea themselves of how many pages there really are on the world wide web either).

These URLs don’t all lead to unique content, either. The web is full of spam, duplicate content, iterative links to nowhere and all sorts of other kinds of web debris.

What it all means: Of the arbitrary 50 trillion URLs figure I’m using, which is itself a fraction of the web, only a fraction of that eventually gets included in Google’s index (and other search engines) for retrieval.

Understanding search engine architecture

In 2002, I created a visual interpretation of the “general anatomy of a crawler-based search engine”:

Clearly, this image didn’t earn me any graphic design awards. But it was an accurate indication of how the various components of a web search engine came together in 2002. It certainly helped the emerging SEO industry gain a better insight into why the industry, and its practices, were so necessary.

Although the technologies search engines use have advanced greatly (think: artificial intelligence/machine learning), the principal drivers, processes and underlying science remain the same.

Although the terms “machine learning” and “artificial intelligence” have found their way more frequently into the industry lexicon in recent years, I wrote this in the section on the anatomy of a search engine 20 years ago:

“In the conclusion to this section I’ll be touching on ‘learning machines’ (vector support machines) and artificial intelligence (AI) which is where the field of web search and retrieval inevitably has to go next.”

‘New generation’ search engine crawlers

It’s hard to believe that there are literally only a handful of general-purpose search engines around the planet crawling the web, with Google (arguably) being the largest. I say that because back in 2002, there were dozens of search engines, with new startups almost every week.

As I frequently mix with much younger practitioners in the industry, I still find it kind of amusing that many don’t even realize that SEO existed before Google was around.

Although Google gets a lot of credit for the innovative way it approached web search, it learned a great deal from a guy named Brian Pinkerton. I was fortunate enough to interview Pinkerton (on more than one occasion).

He’s the inventor of the world’s first full-text retrieval search engine called WebCrawler. And although he was ahead of his time at the dawning of the search industry, he had a good laugh with me when he explained his first setup for a web search engine. It ran on a single 486 machine with 800MB of disk and 128MB memory and a single crawler downloading and storing pages from only 6,000 websites!

Somewhat different from what I wrote about Google in 2002 as a “new generation” search engine crawling the web.

“The word ‘crawler’ is almost always used in the singular; however, most search engines actually have a number of crawlers with a ‘fleet’ of agents carrying out the work on a massive scale. For instance, Google, as a new generation search engine, started with four crawlers, each keeping open about three hundred connections. At peak speeds, they downloaded the information from over one hundred pages per second. Google (at the time of writing) now relies on 3,000 PCs running Linux, with more than ninety terabytes of disk storage. They add thirty new machines per day to their server farm just to keep up with growth.”

And that scaling up and growth pattern at Google has continued at a pace since I wrote that. It’s been a while since I saw an accurate figure, but maybe a few years back, I saw an estimate that Google was crawling 20 billion pages a day. It’s likely even more than that now.

Hyperlink analysis and the crawling/indexing/whole-of-the-web conundrum

Is it possible to rank in the top 10 at Google if your page has never been crawled?

Improbable as it may seem in the asking, the answer is “yes.” And again, it’s something I touched on in 2002 in the book:

From time to time, Google will return a list, or even a single link to a document, which has not yet been crawled but with notification that the document only appears because the keywords appear in other documents with links, which point to it.

What’s that all about? How is this possible?

Hyperlink analysis. Yep, that’s backlinks!

There’s a difference between crawling, indexing and simply being aware of unique URLs. Here’s the further explanation I gave:

“If you go back to the enormous challenges outlined in the section on crawling the web, it’s plain to see that one should never assume, following a visit from a search engine spider, that ALL the pages in your website have been indexed. I have clients with websites of varying degrees in number of pages. Some fifty, some 5,000 and in all honesty, I can say not one of them has every single page indexed by every major search engine. All the major search engines have URLs on the “frontier” of the crawl as it’s known, i.e., crawler control will frequently have millions of URLs in the database, which it knows exist but have not yet been crawled and downloaded.”

There were many times I saw examples of this. The top 10 results following a query would sometimes have a basic URL displayed with no title or snippet (or metadata).

Here’s an example I used in a presentation from 2004. Look at the bottom result, and you’ll see what I mean.

Google is aware of the importance of that page because of the linkage data surrounding it. But no supporting information has been pulled from the page, not even the title tag, as the page obviously hasn’t been crawled. (Of course, this can also occur with the evergreen, still-happens-all-the-time little blunder of someone leaving a robots.txt file in place that prevents the site from being crawled.)
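That blunder usually looks like the first of the two files below: a staging-era robots.txt left live on the production site. The two snippets are alternative files, shown together here only for contrast.

```text
# Version A - the accidental blunder: blocks every crawler from the whole site.
User-agent: *
Disallow: /

# Version B - what was probably intended: an empty Disallow blocks nothing.
User-agent: *
Disallow:
```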

I highlighted that sentence above in bold for two important reasons:

Hyperlink analysis can denote the “importance” of a page before it even gets crawled and indexed. Along with bandwidth and politeness, the importance of a page is one of the three primary considerations when plotting the crawl. (We’ll dive deeper into hyperlinks and hyperlink-based ranking algorithms in future installments.)
Every now and again, the “are links still important” debate flares up (and then cools down). Trust me. The answer is yes, links are still important.

I’ll just embellish the “politeness” thing a little more as it’s directly connected to the robots.txt file/protocol. All the challenges to crawling the web that I explained 20 years ago still exist today (at a greater scale).

Because crawlers retrieve data at vastly greater speed and depth than humans, they could (and sometimes do) have a crippling impact on a website’s performance. Servers can crash just trying to keep up with the number of rapid-fire requests.

That’s why a politeness policy is required, governed on the one hand by the programming of the crawler and the plot of the crawl, and on the other by the robots.txt file.

The faster a search engine can crawl new content to be indexed and recrawl existing pages in the index, the fresher the content will be.

Getting the balance right? That’s the hard part.

Let’s say, purely hypothetically, that Google wanted to keep thorough coverage of news and current affairs and decided to try to crawl the entire New York Times website every day (or even every week) without any politeness factor at all. The crawler would most likely use up all of the site’s bandwidth. And that would mean nobody could read the paper online because of bandwidth hogging.

Thankfully now, beyond just the politeness factor, we have Google Search Console, where it’s possible to manipulate the speed and frequency with which websites are crawled.

What’s changed in 32 years of crawling the web?

OK, we’ve covered a lot of ground as I knew we would.

There have certainly been many changes to both the internet and the world wide web – but the crawling part still seems to be impeded by the same old issues.

That said, a while back, I saw a presentation by Andrey Kolobov, a researcher in the field of machine learning at Bing. He created an algorithm to do a balancing act with the bandwidth, politeness and importance issue when plotting the crawl.

I found it highly informative, surprisingly straightforward and pretty easily explained. Even if you don’t understand the math, no worries, you’ll still get an indication of how he tackles the problem. And you’ll also hear the word “importance” in the mix again.

Basically, as I explained earlier about URLs on the frontier of the crawl, hyperlink analysis is important before you get crawled – indeed, it may well be the reason behind how quickly you get crawled. You can watch the short video of his presentation here.
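To give a flavor of that balancing act, here is a toy Python illustration of my own (not Kolobov’s actual algorithm): each frontier URL is scored by weighing its link-based importance against how recently and how heavily its host has been hit. The weights and the ten-second politeness delay are arbitrary assumptions.

```python
# Toy crawl scheduler: balance importance against politeness and bandwidth.
# An illustrative sketch only; not Bing's or Kolobov's actual algorithm.
POLITENESS_DELAY = 10.0  # assumed minimum seconds between hits to one host

def crawl_score(importance: float, seconds_since_last_hit: float, host_load: float) -> float:
    """Higher is better: importance wins, but recently-hit or busy hosts are penalized."""
    politeness_penalty = max(0.0, POLITENESS_DELAY - seconds_since_last_hit) / POLITENESS_DELAY
    return importance * (1.0 - politeness_penalty) / (1.0 + host_load)

frontier = [
    # (url, importance from hyperlink analysis, seconds since host last hit, host load)
    ("https://example.com/breaking-story", 0.9, 2.0, 0.5),
    ("https://example.org/quiet-archive-page", 0.4, 60.0, 0.1),
]
for url, imp, since, load in sorted(frontier, key=lambda u: crawl_score(*u[1:]), reverse=True):
    print(f"{crawl_score(imp, since, load):.2f}  {url}")
```

Note that the quieter host wins here despite its lower importance: it hasn’t been hit recently, so politeness tips the balance.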

Now let’s wind up with what’s occurring with the internet right now and how the web, internet, 5G and enhanced content formats are cranking up.

Structured data

The web has been a sea of unstructured data from the get-go. That’s the way it was invented. And as it still grows exponentially every day, the challenge search engines have is crawling and recrawling the existing documents in the index, analyzing them, and updating the index whenever changes have been made, to keep it fresh.

It’s a mammoth task.

It would be so much easier if the data were structured. And so much of it actually is, as structured databases drive so many websites. But the content and the presentation are separated, of course, because the content has to be published purely in HTML.

There have been many attempts that I’ve been aware of over the years, where custom extractors have been built to attempt to convert HTML into structured data. But mostly, these attempts were very fragile operations, quite laborious and totally error-prone.

Something else that has changed the game completely is that websites in the early days were hand-coded and designed for the clunky old desktop machines. But now, the number of varying form factors used to retrieve web pages has hugely changed the presentation formats that websites must target.

As I said, because of the inherent challenges with the web, search engines such as Google are never likely to be able to crawl and index the entire world wide web.

So, what would be an alternative way to vastly improve the process? What if we let the crawler continue to do its regular job and make a structured data feed available simultaneously?

Over the past decade, the importance and usefulness of this idea have grown and grown. To many, it’s still quite a new idea. But, again, Pinkerton, WebCrawler inventor, was way ahead on this subject 20 years ago.

He and I discussed the idea of domain-specific XML feeds to standardize the syntax. At that time, XML was new and considered to be the future of browser-based HTML.

It’s called extensible because it’s not a fixed format like HTML. XML is a “metalanguage” (a language for describing other languages which lets you design your own customized markup languages for limitless diverse types of documents). Various other approaches were vaunted as the future of HTML but couldn’t meet the required interoperability.

However, one approach that did get a lot of attention is known as MCF (Meta Content Framework), which introduced ideas from the field of knowledge representation (frames and semantic nets). The idea was to create a common data model in the form of a directed labeled graph.

Yes, the idea became better known as the semantic web. And what I just described is the early vision of the knowledge graph. That idea dates to 1997, by the way.

All that said, it was 2011 when everything started to come together, with schema.org being founded by Bing, Google, Yahoo and Yandex. The idea was to present webmasters with a single vocabulary. Different search engines might use the markup differently, but webmasters had to do the work only once and would reap the benefits across multiple consumers of the markup.

OK – I don’t want to stray too far into the huge importance of structured data for the future of SEO. That must be an article of its own. So, I’ll come back to it another time in detail.

But you can probably see that if Google and other search engines can’t crawl the entire web, the importance of feeding structured data to help them rapidly update pages without having to recrawl them repeatedly makes an enormous difference.

Having said that, and this is particularly important, you still need to get your unstructured data recognized for its E-A-T (expertise, authoritativeness, trustworthiness) factors before the structured data really kicks in.

Cloud computing

As I’ve already touched on, over the past four decades, the internet has evolved from a peer-to-peer network to overlaying the world wide web to a mobile internet revolution, Cloud computing, the Internet of Things, Edge Computing, and 5G.

The shift toward Cloud computing gave us the industry phrase “the Cloudification of the internet.”

Huge warehouse-sized data centers provide services to manage computing, storage, networking, data management and control. That often means that Cloud data centers are located near hydroelectric plants, for instance, to provide the huge amount of power they need.

Edge computing

Now, the “Edgeification of the internet” turns it all back around, from being further away from the user to being right next to them.

Edge computing is about physical hardware devices located in remote locations at the edge of the network with enough memory, processing power, and computing resources to collect data, process that data, and execute it in almost real-time with limited help from other parts of the network.

By placing computing services closer to these locations, users benefit from faster, more reliable services and better user experiences, and companies benefit by being better able to support latency-sensitive applications, identify trends, and offer vastly superior products and services. The terms IoT devices and edge devices are often used interchangeably.

5G

With 5G and the power of IoT and Edge computing, the way content is created and distributed will also change dramatically.

Already we see elements of virtual reality (VR) and augmented reality (AR) in all kinds of different apps. And in search, it will be no different.

AR imagery is a natural initiative for Google, and they’ve been messing around with 3D images for a couple of years now just testing, testing, testing as they do. But already, they’re incorporating this low-latency access to the knowledge graph and bringing in content in more visually compelling ways.

During the height of the pandemic, the now “digitally accelerated” end-user got accustomed to engaging with the 3D images Google was sprinkling into the mix of results. At first it was animals (dogs, bears, sharks) and then cars.

Last year, Google announced that during that period the 3D featured results were interacted with more than 200 million times. That means the bar has been set, and we all need to start thinking about creating these richer content experiences, because the end-user (perhaps your next customer) is already expecting this enhanced type of content.

If you haven’t experienced it yourself yet (and not everyone even in our industry has), here’s a very cool treat. In this video from last year, Google introduces famous athletes into the AR mix. And superstar athlete Simone Biles gets to interact with her AR self in the search results.

IoT

Having established the various phases/developments of the internet, it’s not hard to tell that everything being connected in one way or another will be the driving force of the future.

Because of the advanced hype that much technology receives, it’s easy to dismiss it with thoughts such as IoT is just about smart lightbulbs and wearables are just about fitness trackers and watches. But the world around you is being incrementally reshaped in ways you can hardly imagine. It’s not science fiction.

IoT and wearables are two of the fastest-growing technologies and hottest research topics that will hugely expand consumer electronics applications (communications especially).

The future is not late in arriving this time. It’s already here.

We live in a connected world where billions of computers, tablets, smartphones, wearable devices, gaming consoles and even medical devices, indeed entire buildings are digitally processing and delivering information.

Here’s an interesting little factoid for you: it’s estimated that the number of devices and items connected to IoT already eclipses the number of people on earth.

Back to the SEO future

We’ll stop here. But much more to come.

I plan to break down what we now know as search engine optimization in a series of monthly articles scoping the foundational aspects, although the term “SEO” wouldn’t enter the lexicon for some time, as the cottage industry of “doing stuff to get found at search engine portals” began to emerge in the mid-to-late 1990s.

Until then – be well, be productive and absorb everything around you in these exciting technological times. I’ll be back again with more in a few weeks.

The post We’ve crawled the web for 32 years: What’s changed? appeared first on Search Engine Land.

GA4 isn’t all it’s cracked up to be. What would it look like to switch?

Posted by on May 18, 2022 in SEO Articles | Comments Off on GA4 isn’t all it’s cracked up to be. What would it look like to switch?

GA4 isn’t all it’s cracked up to be. What would it look like to switch?

Google Analytics is the top player when it comes to tracking website visitors. The platform’s value is reflected in its popularity: it is the market leader, boasting an 86% share. But with great value comes great responsibility, and Google Analytics falls short in that department.

Designed to maximize data collection, often at the expense of data privacy, Google Analytics and its parent company, Google LLC, have been on the radar of European privacy activists for some time now. Reports of questionable privacy practices by Google have led to legal action based on the General Data Protection Regulation (GDPR) that might result in a complete ban on Google Analytics in Europe.

On top of that, Google recently announced it will end support for Universal Analytics in July of 2023, forcing users to switch to Google Analytics 4 (GA4). So, if the switch must be made, why not seek a new analytics provider? There are great free and paid solutions that allow organizations to balance valuable data collection with privacy and compliance. With a GDPR-compliant analytics solution in place, your data collection becomes what it should be: predictable and sustainable.

The problem with GA4 from a user perspective

Universal Analytics’ successor is very different from what you’re familiar with. Apart from the new user interface, which many find challenging to navigate, there is a laundry list of issues with the feature set in GA4—from no bounce rate metrics to a lack of custom channel groups. Here are some of the limitations in GA4 from a user perspective that you might find frustrating.

Not-so-seamless migration

GA4 introduces a different reporting and measurement technology that is neither well understood nor widely accepted by the marketing community. There is no data or tag migration between the platforms, meaning you’d have to start from scratch. The challenge grows with the organization’s size—you can have hundreds of tags or properties to move.

Limits on custom dimensions

A custom dimension is an attribute you configure in your analytics tool to dive deeper into your data. You can then pivot or segment this data to isolate a specific audience or traffic for deeper analysis. While GA4 allows you to use custom dimensions to segment your reports, there’s a strict limit—you can only use up to 50.
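To make this concrete: GA4 custom dimensions are just event parameters that you register in the admin interface, and each registered parameter counts toward that limit of 50. Below is a minimal sketch of sending an event with a custom parameter through GA4’s Measurement Protocol; the measurement ID, API secret, and the "article_category" parameter are placeholders, not values from this article.

```python
# Minimal sketch (not from the article): sending a GA4 event with a custom
# parameter via the Measurement Protocol. The parameter ("article_category")
# only becomes a custom dimension once registered in the GA4 admin UI, and
# it then counts toward the 50-dimension limit. IDs below are placeholders.
import json
import urllib.request

MEASUREMENT_ID = "G-XXXXXXXXXX"  # placeholder: your GA4 measurement ID
API_SECRET = "your-api-secret"   # placeholder: created in the GA4 admin

payload = {
    "client_id": "555.666",      # any stable pseudonymous client ID
    "events": [{
        "name": "article_view",
        "params": {"article_category": "technical-seo"},
    }],
}

url = (
    "https://www.google-analytics.com/mp/collect"
    f"?measurement_id={MEASUREMENT_ID}&api_secret={API_SECRET}"
)
req = urllib.request.Request(url, data=json.dumps(payload).encode(), method="POST")
urllib.request.urlopen(req)  # the endpoint returns 2xx even for malformed events
```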

Lack of custom channel grouping

Channel groupings are rule-based groupings of marketing channels and, when customized, allow marketers to check the performance of said channels efficiently. Unlike Universal Analytics, GA4 does not allow you to create custom channel groupings in the new interface, only default channel groupings.

Why Google is giving you a short deadline to make the switch to GA4

It’s startling to consider how little time Google has left the analytics community to act: Universal Analytics will stop processing new hits on July 1, 2023. This could be a way to motivate users to migrate more quickly. Perhaps Google was disappointed with the speed of adoption for GA4 and decided to act decisively for this next version.

Another possibility for the short deadline is that Google wants to cut costs and rid itself of technical debt associated with thousands of websites with legacy solutions installed (many of those users are not active users of the product). Since GA4 is designed to support Google’s advertising network, it guarantees more revenue than the competition.

Whatever the case, users need to prepare to move to GA4—or switch to an alternative. 

The problem with GA4 from a privacy standpoint

Google claims the new platform is designed with privacy at its core, but the privacy concerns are far from over. A lack of clear guidelines on data processing has many questioning the legality of GA4 in Europe. Here are some of the reasons that lead us to believe GA4 won’t last long in Europe.

Recent laws and regulations

Google makes it difficult to collect data in line with data protection regulations such as GDPR. This means that organizations engaged in gathering, storing and processing data about EU citizens have to adjust their policies and introduce serious technological changes to be GDPR-compliant.

One of the key compliance issues with Google Analytics is that it saves user data, including information about EU residents, on U.S.-based cloud servers. As a U.S.-based technology company, Google must comply with U.S. surveillance laws, such as the Cloud Act. This legislation states that Google must disclose certain data when requested, even when that data is located outside of the U.S.

In the judgment known as Schrems II, a European court ruled that sending personal data from the EU to the U.S. via transatlantic transfers is illegal if companies can’t guarantee this data will be safe from U.S. intelligence.

Companies with an international presence must now adapt to a wide range of regulations, often with different requirements and restrictions.

Transparency

A Google guide implies data is transferred to the closest Google Analytics server hub. However, the data may be stored in a geographic location whose privacy protections are not adequate by EU standards. This lack of transparency poses a problem for Google and organizations using Google Analytics in the EU.

Newly introduced features in GA4 partially address this concern by allowing the first part of data collection (and anonymization) on European servers. However, data can, and most likely will, be sent to the U.S. The best thing to do is be open when it comes to collecting data from people.

With proper transparency, individuals feel a sense of safety and assurance. In return, organizations get more data because individuals now feel taken care of and have the trust needed to provide data.

Time to re-think how you handle consumers’ data

The advantage of these regulations is users’ increased awareness of the value of their data. This is where alternatives come in handy. They provide the privacy features you need to comply with laws and obtain the data you want. So, thinking about making the switch to a Google Analytics alternative? Here’s what you need to know.

Addressing concerns about switching to an alternative analytics solution

A lot of users may be hesitant to make the switch. It makes sense—Google has dominated the marketplace for so long that it might feel like too big of a hassle to switch. For a marketing director or CMO to suggest using a different analytics tool and then for that tool to have even more limitations than the last would not be a good look.

You need to make an informed decision and choose the platform whose feature sets fit the organization’s needs to process user-level data while building trust with visitors. Here are the facts and myths when switching:

I’ll lose historical data.

This is a fact, but not for long. Some alternatives have developed data importers in the wake of Universal Analytics (Google Analytics v3) being deprecated.

It’s expensive and hard to switch.

This is a myth. Alternatives are built with easier user interfaces, use similar measurement methodologies, and often have solutions to help with Google Tag migrations.

Alternatives don’t offer demographic data. 

This is true: Google’s first-party data adds gender, age group, and interests to profile data, and none of the alternatives can offer such data enrichment.

I’ll miss some reporting capabilities.

This is false. Each alternative has unique reporting capabilities, and some are very flexible, allowing for more transformations and data exports than Universal Analytics.

It is easier to run advertising campaigns with Universal Analytics.

This is true. There is deep integration between Google Analytics and Google Ads/Google Marketing Platform, which gives access to an extensive repertoire of data.

I’ll lose my rank in Google Search.

This is a myth. Customers of the alternatives don’t report a lower rank in Google Search. Make sure your site is fast, mobile-friendly, popular (with links) and complete with metadata.

The mindset to take when switching

Marketers considering switching to a new platform need to adopt a new analytics mindset. We are experiencing rapidly rising awareness that data is of value and must be protected. Since the future of marketing requires users’ consent, the vendor you choose must allow you to perform analytics in a privacy-friendly way.

Our intention with Piwik PRO Analytics Suite has always been to give clients powerful analytics capabilities along with key privacy and security features. The user interface and feature sets are similar to Universal Analytics, so marketers feel at home when switching to our platform.

Piwik PRO is geared toward delivering both valuable insights and privacy and compliance. Notably, switching to Piwik PRO avoids the privacy and compliance issues associated with Google Analytics, letting you collect data predictably and sustainably. There are both free and paid plans, which allows different organizations to get an analytics service tailored to their needs. If you’d like to learn more about Google Analytics alternatives or get more information on the Piwik PRO Analytics Suite, visit piwik.pro.

This article was written by Maciej Zawadzinski, CEO, Piwik PRO.

The post GA4 isn’t all it’s cracked up to be. What would it look like to switch? appeared first on Search Engine Land.

What to look for in a technical SEO audit

Posted by on May 18, 2022 in SEO Articles | Comments Off on What to look for in a technical SEO audit

According to Techradar, more than 547,200 new websites are created every day. Google has to crawl and store all these sites in its database, occupying physical space on its servers.

The sheer volume of content available now allows Google to prioritize well-designed, fast sites and provide helpful, relevant information for their visitors.  

The bar has been raised, and if your site is slow or weighed down by messy code, Google is unlikely to reward it with strong rankings.

If you really want to jump ahead of your competitors, you have a huge opportunity to be better than them by optimizing your site’s code, speed and user experience. These are some of the most important ranking signals and will continue to be as the internet becomes more and more inundated with content.

Auditing your website’s technical SEO can be an extremely dense task with many moving pieces. If you are not a developer, it may be difficult to comprehend some of these elements.

Ideally, you should have a working knowledge of how to run an audit to oversee the implementation of technical SEO fixes. Some of these may require developers, designers, writers or editors.

Fortunately, various tools will run the audits for you and give you all the comprehensive data you need to improve your website’s technical performance.

Let’s review some of the data points that will come up, regardless of what technical SEO audit tool you use:

Structure

Crawlability: Can Google easily crawl your website, and how often?
Security: Is your website secure with an HTTPS certificate?
On-page SEO elements: Does every page have the keyword in the title tag, meta description, filenames and paths? Does it have the same on-page elements as sites ranking in the top 10 for your target keywords?
Internal links: Does your site have internal links from other site pages? Other elements you can consider are site structure, breadcrumbs, anchor text and link sculpting.
Headings: Is the primary keyword in the H1? Do you have H2s with supporting keywords?
Compliance issues: Does your site’s code include valid HTML? What is the accessibility score?
Images: Do your images load quickly? Are they optimized with titles, keywords and srcset attributes? Do you use newer image formats such as WebP and SVG?
Schema and semantic web: Are your schema tags in place and set up properly? Some schema tags you can use include WebPage, BreadcrumbList, Organization, Product, Review, Author/Article, Person, Event, Video/Image, Recipe, FAQ and How-To.
Canonicals: Do you have canonical tags in place, and are they set up properly?
Sitemap: Do you ONLY have valid pages in the sitemap, with redirects and 404 pages removed?

These are simply a few of the elements you’d want to look into that most tools will report on.  
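If you want to spot-check a handful of these items yourself before reaching for a full crawler, a few lines of scripting go a long way. Here’s a minimal sketch, assuming the third-party requests and beautifulsoup4 packages are installed; the checks and the example URL are illustrative only.

```python
# A minimal spot-check of a few on-page items from the list above for a
# single URL. Assumes the third-party requests and beautifulsoup4 packages
# are installed; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

def audit_page(url: str) -> dict:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return {
        "title": soup.title.string.strip() if soup.title and soup.title.string else None,
        "meta_description": (soup.find("meta", attrs={"name": "description"}) or {}).get("content"),
        "canonical": (soup.find("link", rel="canonical") or {}).get("href"),
        "h1_count": len(soup.find_all("h1")),
        "has_schema": bool(soup.find("script", type="application/ld+json")),
    }

print(audit_page("https://example.com/"))
```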

User experience

Google has been placing more focus on ranking factors revolving around user experience. As the web collectively becomes more organized, Google is raising the bar for user experience. Focusing on user experience will ultimately increase their advertising revenue.   

You’ll want to audit the user experience of your website.

Is it fast? How quickly is the page interactive? Can it be navigated easily on mobile devices? Is the hierarchy of the site clear and intuitive?

Some of the ways of measuring this include:

Site speed
Core Web Vitals
Mobile-friendliness
Structured navigation
Intrusive ads or interstitials
Design
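Several of these, including the Core Web Vitals lab metrics, can be pulled programmatically from the public PageSpeed Insights v5 API, which runs Lighthouse under the hood. A hedged sketch (no API key is needed for light usage, though Google recommends one at volume):

```python
# Hedged sketch: pulling lab metrics (including Core Web Vitals) from the
# public PageSpeed Insights v5 API, which runs Lighthouse under the hood.
import json
import urllib.request
from urllib.parse import quote

def psi_metrics(url: str, strategy: str = "mobile") -> dict:
    api = (
        "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
        f"?url={quote(url, safe='')}&strategy={strategy}"
    )
    with urllib.request.urlopen(api) as resp:
        data = json.load(resp)
    audits = data["lighthouseResult"]["audits"]
    return {
        "LCP": audits["largest-contentful-paint"]["displayValue"],
        "CLS": audits["cumulative-layout-shift"]["displayValue"],
        "TBT": audits["total-blocking-time"]["displayValue"],
    }

print(psi_metrics("https://example.com/"))
```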

Make sure you are working with a developer who is well-versed in the latest technical SEO elements and who can apply the changes required to raise your SEO performance score.

Technical SEO audit tools

Some of the most popular SEO audit tools include:

Semrush Site Audit
Screaming Frog
SiteBulb
Website Auditor
ContentKing App
GTmetrix
Pingdom
Google Lighthouse
Google PageSpeed Insights

We’ll look at a couple of these tools and the data points you can gain from them.

Semrush site audit

Once you create a project in Semrush, you can run a site audit. Your overview will look like this:

Click on the “Issues” tab, and you’ll see a detailed list of the issues that were uncovered, divided by Errors, Warnings and Notices:

If you click on an item, you’ll see a list of the pages affected by each issue.

Review these carefully, as some of the data points may not be valid.

Ideally, you should export the CSV for each of these issues and save them in a folder.

Screaming Frog

This desktop tool will use your computer and IP to crawl your website. Once the crawl is complete, you’ll get various reports that you can download.

Here are a couple of example reports:

This is an overview report that you can use to track technical audit KPIs.

For example, this report gives you details of the meta titles for each of your pages.

You can use the Bulk Export feature to get all of the data points downloaded into spreadsheets, which you can then add to your Audit folder.

SiteBulb

Like the others, SiteBulb will do a comprehensive crawl of your website. The benefit of this tool is that it will give you more in-depth technical information than some of the other tools.

You’ll get an Audit Score, SEO Score, and Security Score. As you implement fixes, you’ll want to see these scores increasing over time.

Google Search Console

The Index Coverage report contains a treasure trove of data that you can use to fix the issues Google has discovered on your site.

In the details section, you’ll see a list of the errors, and if you click through to each report, they will include the list of pages affected by each issue.

Implementing technical SEO fixes

Once you have all of your CSV exports, you can create a list of all of the issues and go through them to remove duplicate reports created by the different tools.

Next, you can assign which department each fix belongs to and its level of priority. Some may need to be tackled by your developer, others by your content team, such as rewriting duplicate titles or improving descriptions on pages with low CTR.

Here’s what your list might look like:

Each project should include notes, observations, or details about how to implement the fix. 
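The consolidation itself is easy to script. Here’s a minimal sketch using pandas, assuming each export has at least "Issue" and "URL" columns; adjust the column names to whatever your tools actually emit.

```python
# A minimal consolidation sketch using pandas. Assumes every export in the
# audit-exports folder has at least "Issue" and "URL" columns -- adjust the
# names to whatever your tools actually emit.
from pathlib import Path
import pandas as pd

frames = []
for csv_file in Path("audit-exports").glob("*.csv"):
    df = pd.read_csv(csv_file)
    df["source_tool"] = csv_file.stem           # remember which tool found it
    frames.append(df)

issues = pd.concat(frames, ignore_index=True)
issues = issues.drop_duplicates(subset=["Issue", "URL"])  # cross-tool duplicates
issues["department"] = ""                       # to assign: dev, content, design...
issues["priority"] = ""                         # e.g., by number of pages affected
issues.to_csv("master-issue-list.csv", index=False)
```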

Most websites will have dozens of issues, so the key here is to prioritize the issues and make sure that you are continuously fixing and improving your site’s performance each month.

E-A-T Audit

It’s important that your website reflects topical authority and relevance. E-A-T means:

Expertise: Are you an expert in your field? Are your authors authoritative?
Authoritativeness: Are you considered authoritative in your field by industry organizations? Do your social profiles, citations, social shares and link profile reflect this authoritativeness?
Trustworthiness: Can visitors trust that your website is secure and that their data is safe? Does your site have an SSL certificate, and does it include privacy disclaimers, refund information, contact info and credentials?

Google has an entire team of Quality Raters that manually review websites to assess them based on these parameters. Google has even published the Quality Raters E-A-T guidelines for site owners to reference.

If your website is in a YMYL (Your Money, Your Life) niche, these factors are even more important as Google attempts to protect the public from misinformation.

Analytics audit

Is your Google Analytics code working properly? Do you have the proper goals and funnels to fully understand how users navigate your site? Are you importing data from your Google Ads and Search Console accounts to visualize all of your data in Google Analytics? 
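The first of those questions is easy to automate: crawl a handful of key pages and confirm the analytics snippet is actually present. A rough sketch, where the page list and tag patterns are assumptions to adapt to your own setup:

```python
# Rough sketch: confirm the Google Analytics snippet is present on key pages.
# The page list and the tag patterns are assumptions -- adapt to your setup.
import re
import requests

PAGES = ["https://example.com/", "https://example.com/pricing"]
TAG_PATTERN = re.compile(r"gtag\(|G-[A-Z0-9]{6,}|UA-\d{4,}")

for url in PAGES:
    html = requests.get(url, timeout=10).text
    status = "tag found" if TAG_PATTERN.search(html) else "NO TAG"
    print(f"{url}: {status}")
```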

BrainLabsDigital has created a Google Analytics audit checklist that will help you review your Google Analytics account. The accompanying article will give you a straightforward and strategic approach to ensuring your Google Analytics is set up properly.

Prioritizing technical SEO fixes

Make sure you prioritize continuously improving your on-page SEO. Depending on your site, you may have a list of a dozen or a few hundred fixes. Try to determine which fixes will impact the most pages so that you see a greater improvement from your efforts.

It can be discouraging to see a list with 85 different technical SEO improvements. The benefit is that, as you go through these improvements, you will start seeing movement in your rankings.  Over time, you’ll want to have very few, if any, errors show up in all of your crawling tools.

If your content is relevant, targeted and well developed, and you’re receiving new, quality links every month, these technical optimizations will become the key differentiating factors for ranking better than your competitors.

The post What to look for in a technical SEO audit appeared first on Search Engine Land.

YouTube Channel Name – 5 Tips for Choosing a Great Name

Posted by on May 18, 2022 in SEO Articles | Comments Off on YouTube Channel Name – 5 Tips for Choosing a Great Name

Choosing a name for your YouTube channel is the first step in building a successful YouTube presence. In this post, you’ll learn how to choose a channel name to stand out from the crowd, how to change your channel name (if you already have one), and the difference between a channel name and a channel […]

The post YouTube Channel Name – 5 Tips for Choosing a Great Name appeared first on reliablesoft.net.

WordPress Vs. Wix

Posted by on May 18, 2022 in SEO Articles | Comments Off on WordPress Vs. Wix

WordPress Vs. Wix

Disclosure: This content is reader-supported, which means if you click on some of our links that we may earn a commission.

WordPress outshines Wix by allowing users to create sophisticated websites and customize them to their liking.

Wix doesn’t have as much design freedom as WordPress, but it is going to be way easier for beginners to use on day one.

With WordPress and Wix serving different users, the final decision ultimately depends on your experience level and purpose in launching a website. 

WordPress or Wix: Which is Better?

WordPress is a content management system best for those who value flexibility and versatility. It doesn’t come with great functionality right out of the box, but you can customize it through thousands of themes and plugins. Get your website idea off the ground using WordPress’ powerful, flexible platform. 

Wix is best for beginners who want to whip up a basic website that does not need advanced features. It’s an entry-level tool with an intuitive drag-and-drop editor so you can churn out a website in minutes without much technical know-how. Create your own website today and let Wix do the heavy lifting. 

A Review of The Best Website Builders

Website builders provide the tools you need to spin up basic or highly versatile websites without touching any code. But it can be challenging to decide which website builder fits your needs if you’re clueless about what to look for.

With years of experience under my belt, I’ve learned the key differentiators you should look for when shopping for a website builder. Using this in-depth review of the top website builders, you’ll be able to zero in on the best platform that will set up your website for future success. Both WordPress and Wix made the list.

WordPress Wins

Full data ownership: With WordPress being open-source software, nobody has you locked in, so transferring from one host to another isn’t complicated. You own all your files and can take them to any hosting provider that supports WordPress.

Site transfers are facilitated by the built-in WordPress Import and Export tools, features you won’t find in Wix. In addition, Wix sites are hosted exclusively on Wix’s servers. Therefore, transferring your website files to a self-hosted WordPress requires a complex, cumbersome process. 
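For a sense of what that ownership looks like in practice, the Export tool produces a WXR (XML) file you can inspect or transform yourself. A small sketch, where the namespace URI and file name are assumptions based on a typical WXR 1.2 export:

```python
# Small sketch: counting the posts in a WordPress export (WXR) file, the
# kind produced by Tools > Export. The namespace URI and file name are
# assumptions based on a typical WXR 1.2 export.
import xml.etree.ElementTree as ET

NS = {"wp": "http://wordpress.org/export/1.2/"}
tree = ET.parse("wordpress-export.xml")

posts = [
    item.findtext("title")
    for item in tree.getroot().iter("item")
    if item.findtext("wp:post_type", namespaces=NS) == "post"
]
print(f"{len(posts)} posts found, e.g.: {posts[:3]}")
```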

Limitless design options: WordPress is generous to a fault when giving users the power to design their websites. Free themes are available for starters, but if you’re looking for more functionality, you can choose from over 5,000 premium third-party themes that cost anywhere between $25 and $299. 

You can also hire a developer to create a custom theme with unconventional features to make your website stand out. The best part is you can easily switch from one theme to another, unlike Wix, which doesn’t allow this once your site goes live.

No WordPress-sponsored ads: WordPress is free, but you need to purchase a domain name and a hosting plan for it to work (here’s an easy step-by-step guide for getting those). But once your website is up and running, you’re free to monetize it any way you want. Meanwhile, Wix’s free and most basic plans are riddled with ads you don’t control. Wix only removes these ads once you upgrade to higher premium plans.

Unmatched customizability: As open-source software, WordPress offers great flexibility right out of the box. If you’re a programmer or somebody who knows how to code, you can tweak WordPress’s underlying code to create a website exactly as you envision it. Beginners can also customize their WordPress sites as they please.

From switching templates after the website goes live to improving site security and adding forms to blog posts, WordPress allows you to modify your site in ways that are impossible with Wix. 

And even if you love Wix for its drag-and-drop builder, WordPress also offers something better with third-party page builder plugins like Elementor or Divi. 

Massive selection of plugins: WordPress beats Wix with the largest number of plugins to enhance your website’s functionality. Whether you want to add a table, collect your visitors’ emails, or create a landing page, there’s always a plugin that will match your needs. There are over 55,000 free and premium plugins to choose from, so it’s easy to beef up your website without touching any code.

Robust blogging platform: Even though it has evolved into a full-blown content management system, WordPress hasn’t forgotten its roots and continues to be the platform of choice for all things blog-related. It offers the same blogging features as Wix but takes it up a notch with advanced features you’ll only find in WordPress.

These include a native commenting section so you can manage your readers’ comments without the need for additional plugins. You also have complete control over posts and pages’ visibility as you can set them to public, private, or password-protected. 

The new Gutenberg editor has its flaws, but the ease of adding, dragging, and dropping different elements without the need for scrolling up or down is second to none. 

Better search visibility: WordPress gives you more control, so your website is more likely to rank in search engine results. Slow website? You can improve your site’s performance by applying speed-boosting strategies.

WordPress users can also install Yoast, the world’s most popular SEO plugin. This tool helps over five million websites improve how their articles appear on search results, insert internal links, and redirect old pages to new ones, all of which help boost their rankings in the long run. 

Ahrefs, a leading all-around SEO tool, once analyzed 6.4 million websites, and the results show that WordPress sites have higher domain authority, backlinks, and traffic than their Wix counterparts. 

Scalable ecommerce functionality: WordPress lacks built-in ecommerce features, but you can easily integrate it with a powerful ecommerce solution like WooCommerce. 

Used by over 40% of ecommerce sites, this plugin can help you display products, fulfill orders, receive payments in multiple currencies, and automatically calculate tax by region. 
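WooCommerce also exposes a REST API (v3) over /wp-json/wc/v3/ for products, orders and more, which makes custom integrations straightforward. A hedged sketch, where the store URL and the consumer key/secret (generated in the WooCommerce settings) are placeholders and the requests package is assumed:

```python
# Hedged sketch of WooCommerce's REST API (v3): listing products. The store
# URL and the consumer key/secret (generated under WooCommerce > Settings >
# Advanced > REST API) are placeholders; assumes the requests package.
import requests

STORE = "https://example-store.com"
AUTH = ("ck_placeholder", "cs_placeholder")  # consumer key / secret

resp = requests.get(f"{STORE}/wp-json/wc/v3/products", auth=AUTH, timeout=10)
resp.raise_for_status()
for product in resp.json():
    print(product["name"], product["price"])
```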

To further boost its functionality, you can also install additional WooCommerce plugins. For example, you can connect your store to a print-on-demand service that will process the orders on your behalf. 

There’s also the free HubSpot for ecommerce plugin that lets you track your visitors and lead them into your sales funnel. 

WordPress Losses

No phone support available: WordPress relies on its global community of volunteers. Hence, there’s no customer service hotline that you can turn to if you encounter technical hiccups. 

However, over 39% of websites in the world are powered by WordPress, so whatever issue you’re dealing with, chances are you’ll find a troubleshooting guide in support forums, Slack channels, or YouTube. But having to go looking for solutions is annoying.

The cost of creating and maintaining a website varies: WordPress is free to download. However, it can’t stand on its own, so you need to pay for a domain name and a hosting plan before you can start a website. 

A custom domain name starts at $10 per year, while a hosting plan can start as cheap as $2.95 a month. If your website gets a lot of traffic, you need to pay more for a hosting plan to avoid downtimes with each traffic spike. 

Additional expenses include premium themes and plugins to enhance the functionality of your site. In total, you might spend anywhere from $200 to a few thousand dollars every year.

Steep learning curve: WordPress is not the most intuitive, and it takes time to understand it. It doesn’t come with an official walkthrough, so you have to learn everything on your own. 

WordPress’s Gutenberg editor, for instance, is not as intuitive as Wix’s drag-and-drop editor. It lacks a what-you-see-is-what-you-get (WYSIWYG) framework, so you need to preview the page you’re working on to see what it will look like once published.

DIY security and maintenance: Website upkeep is not hands-off for WordPress users. The WordPress software itself rolls out updates every once in a while, and it’s your responsibility to ensure you’re using the latest version. 

You also need to manually update themes and plugins to fix bugs and prevent them from affecting site performance. 

As for backups and security, you can handle them yourself with the help of plugins. In exchange for fixed monthly fees, you can outsource all these to a WordPress maintenance service or choose a managed WordPress host to handle everything for you. 

Wix Wins

No upfront cost: Anybody can register and create a Wix website for free. If you want more functionality and to remove the Wix ads, you can upgrade to one of the Wix website plans or the business and ecommerce plans. 

Unlike WordPress, which is 100% free but comes with extra expenses, Wix’s pricing is much more straightforward. 

You don’t have to estimate anything as the pricing page has all the details. The Combo Plan starts at $14 per month and comes with everything you need to get started. You even get a free domain name for the first year, which you have to purchase on your own if you use WordPress.  

If you want more features like chatbots or event booking, you can also add Wix apps that are either free or premium with prices ranging from $3 to $20. 

Effortless registration: Although the software itself is free, creating a website with WordPress requires the additional steps of domain registration and signing up to a hosting provider. With Wix, registration is as easy as signing up using your Facebook or Google account. Then, you can start building your website right away, either through the editor or with the help of an AI tool.

Intuitive drag-and-drop editor: Wix sacrifices flexibility so users can build websites fast with little to no learning curve. Beginners can easily customize their website templates with Wix’s drag-and-drop interface that lets you move things around and add as many features as you please. 

Unlike WordPress’s Gutenberg editor, Wix’s operates on a what-you-see-is-what-you-get (WYSIWYG) paradigm, so everything you see is exactly what will appear once the site goes live.

If you’re short on time, you can also let Wix’s Artificial Design Intelligence (ADI) tool automatically create a website based on your answers to a series of questions. 

Wide selection of free templates: Wix offers the most free pre-made templates among all website builders. With over 500 templates to choose from, it’s easy to pick one that best aligns with your brand, whether you’re a non-profit organization, a photographer, a small business, or a school. 

Each template is packed with built-in elements that you can drag and drop wherever you want. The editor also gives you the freedom to change the background image into a video, select a different color palette, change the font, and tweak other elements that matter to you. 

Handpicked in-house and third-party apps: Wix’s over 250 apps are the counterpart of WordPress’s plugins that enhance a website’s functionality. WordPress plugins may outnumber Wix’s, but you have to sift through piles of clunky ones before finding what you need. 

With Wix, however, quality matters more than quantity. Everything in the Wix App Market has already passed Wix’s guidelines, so you won’t have to perform a vetting process. In addition to that, all apps integrate well with Wix, so there’s no need to worry about incompatibility issues. 

Multiple customer support channels: Unlike WordPress, which mainly relies on its support forums, Wix employs a more personal approach to customer service.

Users can request a call-back through the Wix website so a company representative can talk to them directly. This way, they won’t have to waste time fixing the issue themselves and can let a real person handle it instead.

If phone support is unavailable, Wix users can also reach out via email or search Wix’s online help center to find relevant articles and video tutorials. 

Easy WordPress-to-Wix migration: If you jumped on the bandwagon but soon found out that WordPress lacks the simplicity you’re looking for, Wix offers an easy way out. All you need to do is enter the WordPress blog URL in Wix’s blog import tool, and with one click, your WordPress blog will be converted into a Wix blog. 

With this import tool, you can choose only the blog posts you want to move to Wix without building anything from scratch.

Hands-off security and maintenance: The benefit of having your website hosted in a closed ecosystem like Wix is its technical team handles all maintenance and security. This means you won’t have to lift a finger to create backups, update apps, or fix security glitches. 

All updates are implemented and deployed by the in-house technical team, so you won’t even notice there’s an update taking place. In contrast, WordPress doesn’t handle any of these right out of the box, so you’re basically on your own. 

Plus, Wix sites pass the highest levels of industry security compliance, from PCI DSS for ecommerce payments to SOC Type 2, and ISO 27001, 27701, 27018, and 27017.

Managed infrastructure and site data backups: Wix also delivers even more helpful features for reliability that you don’t have to handle yourself. Their data center infrastructure is robust enough to keep your site always available, even during routine maintenance. Plus, autoscaling allows your site to handle traffic spikes without missing a beat.

And, your site data is always kept up-to-date and backed up with redundant copies kept across all of those Wix data centers. You’re never left worrying if your site is available to visitors or if you can restore it should something go terribly wrong.

Wix Losses

Pricing page lacks transparency: At first glance, Wix’s pricing page seems to show how much you’d pay for each premium plan every month. However, the prices shown are actually what each plan costs if you opt for an annual subscription.

If the Unlimited plan’s monthly cost is $12.50 per month, you’ll actually pay $150 upfront. If you opt for monthly payments, the cost is significantly higher. 

It wouldn’t have been an issue had Wix placed a toggle button through which users can compare the prices if billed monthly or annually. To be fair, there’s a disclaimer at the bottom saying that the prices displayed are for yearly subscriptions, but the text is so small you won’t notice it right away. 

Limited flexibility: Wix may be easier to use than WordPress, but that comes at the price of flexibility. 

For instance, the free templates have good enough designs for inexperienced builders but are limiting for more advanced website creators. Not to mention that you won’t be able to switch to a different template once the website goes live. 

Wix is also not open-source, so programmers and other tech-savvy users won’t be able to tinker with its underlying code. Lastly, the over 250 apps inside its App Market can enhance site functionality, but they pale in comparison to the thousands of WordPress plugins.

Underwhelming blog features: Wix wasn’t created with bloggers in mind, so if written content is your website’s main attraction, choose WordPress instead. 

Although Wix offers basic blog features like categories, tagging, cover image, and post scheduling, it lacks other vital elements like native commenting. 

For readers’ comments, Wix only offers Facebook comments, which are much more vulnerable to a slew of spammers. Also, Wix’s plain blog editor doesn’t have the drag-and-drop functionality of WordPress’s Gutenberg, so the formatting options are limited.

Ad-free site not available in all plans: A free Wix website comes with Wix ads and a Wix subdomain. You won’t pay for anything, but it’s not good for branding. If you already have a custom domain, you can connect it to Wix for $4.50 a month (billed annually). However, the Wix ads remain at this level. You can only get rid of the Wix-sponsored ads if you upgrade to more expensive plans. 

Difficult to get out of: Wix websites are hosted on Wix’s infrastructure, so once you create a website with them, it’s stuck on their hosting for life. Wix also lacks the Import and Export functions of WordPress, so moving your website files from Wix to WordPress is tricky.

To give you an idea, your posts will be imported in the form of RSS files, while your images will need to be transferred manually. For most users, this complex procedure is enough to discourage them from making the switch. 

Inferior built-in SEO functionality: Wix is not a terrible choice if you only get direct or social media traffic on your site. But when it comes to search engine visibility, Wix lags behind WordPress. 

Wix is not up to snuff from a technical SEO standpoint as it relies on JavaScript to render its pages, making them more difficult to crawl. The JavaScript also leads to code bloat, resulting in slower pages.

Wix also doesn’t have the basic features to set up a website to SEO success like hreflang and AMP support. Users have limited control over redirects and are restricted from editing the site’s robots.txt and sitemap. Creating shorter URLs is also impossible, so you’ll be stuck with https://www.neilpatel.com/post/keyword instead of the more concise and user-friendly https://www.neilpatel.com/keyword.
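Both of those limitations are easy to see for yourself. The sketch below fetches a site’s robots.txt and counts hreflang annotations on the homepage; on WordPress both are under your control, while on Wix they are largely managed for you. It assumes requests and beautifulsoup4 are installed, and example.com is a placeholder.

```python
# Quick look at the control being discussed: fetch a site's robots.txt and
# count hreflang annotations on the homepage. example.com is a placeholder;
# assumes requests and beautifulsoup4 are installed.
import requests
from bs4 import BeautifulSoup

site = "https://example.com"
print(requests.get(f"{site}/robots.txt", timeout=10).text[:300])

soup = BeautifulSoup(requests.get(site, timeout=10).text, "html.parser")
hreflangs = soup.find_all("link", rel="alternate", hreflang=True)
print(f"{len(hreflangs)} hreflang tags found")
```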

Comparing The Top Website Builders

Whether you’re a tech-savvy geek or a technophobe who wants to take a stab at creating websites, there’s a website builder that meets your needs. Here are my top recommendations:

Wix — Best for general use
Weebly — Best for beginners
Web.com — Best for building landing pages
Shopify — Best for ecommerce
WordPress — Best for content management

In terms of Wix and WordPress, if you’re a novice who needs a leg up in creating your first website, Wix can get you online fast. The ease of its drag-and-drop editor is second to none, while its wide range of free templates can give you a professional-looking website without breaking the bank. 

For high-traffic websites that generate income from content, WordPress remains the best content management system.