Blog

Microsoft Bing drops anonymous sitemap submission due to spam issues

Posted by on May 13, 2022 in SEO Articles | Comments Off on Microsoft Bing drops anonymous sitemap submission due to spam issues

Microsoft Bing will no longer accept XML sitemaps submitted anonymously through HTTP requests, Fabrice Canel from Microsoft Bing announced today. Bing is dropping anonymous sitemap submission because search spammers have been abusing it.

How anonymous sitemap submission worked. Microsoft explained that since the inception of XML sitemaps at Bing, Bing has allowed anonymous submission via an HTTP request, such as http://www.bing.com/ping?sitemap=http%3A%2F%2Fwww.example.com/sitemap.xml.
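For context, here is a minimal Python sketch of what such an anonymous ping looked like, built from the endpoint quoted above (as of this announcement, Bing no longer honors these requests):

```python
from urllib.parse import quote
import requests

# The sitemap URL had to be percent-encoded before being passed
# as the "sitemap" query parameter.
sitemap_url = "http://www.example.com/sitemap.xml"
ping_url = "http://www.bing.com/ping?sitemap=" + quote(sitemap_url, safe="")

# Bing deprecated this anonymous submission path on May 13, 2022,
# so the request may still return a response, but the sitemap is
# no longer processed.
response = requests.get(ping_url)
print(response.status_code)
```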

No longer supported. Microsoft is doing away with anonymous sitemap submission via HTTP requests starting today, May 13, 2022. Fabrice Canel wrote that Microsoft Bing is “deprecating the ability for anonymous sitemap submissions starting today.”

Why is it going away. SEO and search spammers have been abusing anonymous sitemap submission, so Microsoft will no longer support it. Fabrice Canel wrote that “recent evaluations have shown that it was often subject to misuse by spammers.”

How to submit a sitemap to Bing. You can still submit your sitemaps via robots.txt on your domain and/or through Bing Webmaster Tools.

robots.txt: Add a reference to your sitemap in the robots.txt file located at the root of the host to inform all search engines. Example: Sitemap: http://www.example.org/sitemap.xml
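For instance, a minimal robots.txt at the root of your host could look like this, using the example sitemap URL from above (the Sitemap directive is read by all major search engines):

```
User-agent: *
Disallow:

Sitemap: http://www.example.org/sitemap.xml
```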

Bing Webmaster Tools: Alternatively, you can submit your sitemaps in Bing Webmaster Tools at https://www.bing.com/webmasters/sitemaps

Don’t forget IndexNow. Microsoft Bing would, of course, still love for you to push content to it using IndexNow, so you have that as an additional, supplemental option.
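A minimal sketch of an IndexNow submission in Python, assuming you have generated an IndexNow API key and hosted the corresponding key file on your site as the protocol requires (the URL and key below are placeholders):

```python
import requests

# Hypothetical values: replace with your own page URL and IndexNow key.
url_to_submit = "https://www.example.org/new-page"
indexnow_key = "your-indexnow-key"

# Per the IndexNow protocol, a single changed URL can be submitted
# to Bing's endpoint with a simple GET request.
response = requests.get(
    "https://www.bing.com/indexnow",
    params={"url": url_to_submit, "key": indexnow_key},
)
print(response.status_code)  # 200 or 202 indicates the submission was accepted
```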

Why we care. If you have been submitting your XML sitemap anonymously through HTTP requests, be aware that this will stop working sometime today. Switch to robots.txt or Bing Webmaster Tools for submitting your sitemap going forward.

The post Microsoft Bing drops anonymous sitemap submission due to spam issues appeared first on Search Engine Land.

10 Best Coursera Certificates Worth Pursuing in 2022 (Reviews)

Posted by on May 13, 2022 in SEO Articles | Comments Off on 10 Best Coursera Certificates Worth Pursuing in 2022 (Reviews)

Coursera partners with top companies and universities to offer accredited certifications covering a wide range of subjects. One of the most important advantages of a Coursera certificate is that it provides students with job-ready skills and resources to help them apply for related jobs. We did our research to find the best Coursera certificates that […]

The post 10 Best Coursera Certificates Worth Pursuing in 2022 (Reviews) appeared first on reliablesoft.net.

Google Search Console adds translated results search appearance filter

Posted by on May 13, 2022 in SEO Articles | Comments Off on Google Search Console adds translated results search appearance filter

Google Search Console has added “translated results” to the search appearance filter in the performance reports today. This filter lets you drill down into how searchers who get translated results interact with your content in Google Search.

How to access this filter. You can access this filter by logging into Google Search Console, opening your performance report, clicking on the “+new” filter and selecting the search appearance “translated results.”

Translated results report. This report breaks down the queries, pages, countries, devices and so on that these searchers use when finding your content in Google Search. When you filter by “translated results,” the report shows only searchers who received translated results.

More on translated results. Searchers who see translated results are likely based in India and speak Indonesian, Hindi, Kannada, Malayalam, Tamil or Telugu. Google automatically translates the title link and description it shows for your site’s snippet in the search results. When the user clicks on the result, Google Translate will likely translate your page as well.

We covered this in more detail when Google released its help document on translated results last year.

As we covered there, the translated results help document explains how Google may automatically translate a search result snippet from the language it was written in to the language of the Google Search results page. Google said “sometimes Google may translate the title link and snippet of a search result for results that aren’t in the language of the search query.” Google said it does this because “a translated result is a Google Search feature that enables users to view results from other languages in their language, and can help publishers reach a larger audience.”

At the time this story was published, translated results work for the Indonesian, Hindi, Kannada, Malayalam, Tamil and Telugu languages. The feature is only available on mobile devices, in any browser that supports Google Search.

After the user clicks the translated search result link, Google said that “all further user interaction with the page is through Google Translate.” Google said you can opt out of this through a notranslate meta robots tag. Here are more details on opting in or out of translated results.

Brodie Clark was the first to spot this and wrote, “a new Search Appearance filter has surfaced in Google Search Console this month titled ‘translated results’. The filter relates to situations where Google has translated the title link and snippet for a web page within Search.”

Why we care. Google is now giving us some idea of how these searchers are interacting with our site through Google Search. We can see what they are searching for, what pages they are accessing, on what dates, what devices and from which countries.

More data and more insights help us as site owners, publishers, content writers and SEOs.

The post Google Search Console adds translated results search appearance filter appeared first on Search Engine Land.

The anatomy of personalized search

Posted by on May 13, 2022 in SEO Articles | Comments Off on The anatomy of personalized search

Buyers expect easy navigation, relevant search results, and tailored search experiences that remember their preferences and rerank products accordingly, especially for in-session browsing. Unfortunately, in 2022 consumers are still experiencing null results and poor relevance, and are often pointed in the wrong direction by their favorite brands.

If you’re running an e-commerce site, your internal KPIs most likely hinge on consumer retention, acquisition and site revenue. All three feed into brand loyalty and therefore keep that cycle moving.

Most of the online storefronts you manage started with a passion for the product and/or a drive to solve a consumer problem. So, when improving your site’s functionality, how often do you work backward from the customer, focusing your site search and product discovery efforts on their needs?

Being able to identify and implement the right AI-backed functionalities that power intentional search experiences will help your customers win, help your brand’s site win and make you look like a product discovery hero.

Unlock magical experiences that foster customer loyalty online and better understand the capabilities of your site’s search engine. Continue your journey to becoming a product discovery hero by mastering the three key areas of personalized search: semantic understanding, SKU select and 1:1 and segment-based personalization. 

When you understand the anatomy of personalized search, your search bar will be able to understand your customers. It’s a win-win. 

We’re on a mission for #nomorenullresults.

The post The anatomy of personalized search appeared first on Search Engine Land.

Entities and E-A-T: The role of entities in authority and trust

Posted by on May 13, 2022 in SEO Articles | Comments Off on Entities and E-A-T: The role of entities in authority and trust

The development of Google into a semantic search engine and the increasing influence of E-A-T on rankings go hand in hand.

There is a common thread of innovation and updates that Google has been following for the past 12-plus years.

Organizing data and information around entities makes it possible for Google to rank entities of the type Person, such as authors, and of the type Organization, such as publishers and companies, with regard to topics according to E-A-T.

Authors, companies and publishers as entities

Content is published by people such as authors and organizations such as companies, associations and government agencies. These organizations and people are named entities.

Google increasingly arranges or organizes content around entities. Google can draw conclusions about the credibility and relevance of the document or content via the respective entity.

In the case of online content, there are usually at least two parties involved: the author or producer who created the content, and the publisher or domain on which the content is published.

The author is not always a direct employee or owner of the domain. For example, in the case of a guest article, the publisher and author are not the same.

In my view of SEO, entity classes such as organizations, products and people play a special role, as these can be evaluated via the characteristics of a brand, such as authority and trust, or E-A-T.

Digital representations of entities

Entities that belong to certain entity classes, such as persons or organizations, can have digital representations such as an official website (domain), social media profiles, images and Wikipedia entries. While images tend to be the visual representation of the entity, especially for people or landmarks, the website or social media profile is its content-based representation.

These digital representations are the central landmarks closely linked to the entity.

Google can identify this linkage primarily through external links to the website or profiles whose link texts contain the exact entity name, and/or through distinctive click behavior on the URL for search queries with navigational, brand-related or person-related search intent.

It’s all about relevance, trust and authority

The credibility of author and publisher has become increasingly important for Google. The search engine came under considerable pressure because of its fake news problem. A high degree of accuracy and relevance is a top priority for Google and its users.

Through numerous core updates and the E-A-T ratings introduced as part of the Page Quality (PQ) rating in version 5.0 of the Quality Rater Guidelines in 2015, it is clear how important the factors of relevance, trust and authority are to Google.

The Quality Rater Guidelines list the following important criteria for evaluating a website:

- The Purpose of the Page
- Expertise, Authoritativeness, Trustworthiness: This is an important quality characteristic. Use your research on the additional factors below to inform your rating.
- Main Content Quality and Amount: The rating should be based on the landing page of the task URL.
- Website Information/information about who is responsible for the MC: Find information about the website as well as the creator of the MC.
- Website Reputation/reputation about who is responsible for the MC: Links to help with reputation research will be provided.

Here, E-A-T, transparency about who operates the website and that operator’s reputation play a role in the domain-wide evaluation.

Expertise, authoritativeness and trustworthiness are currently described as follows in the Quality Rater Guidelines:

- The expertise of the creator of the MC.
- The authoritativeness of the creator of the MC, the MC itself, and the website.
- The trustworthiness of the creator of the MC, the MC itself, and the website.

From entity to digital authority and brand

If we look at the characteristics of a brand, expertise, authority and trust play a central role.

In addition to the aforementioned characteristics, popularity is also an important characteristic of a brand, although this is not necessarily the main focus for authority or expertise.

Therefore, it can be said that a brand also combines all the characteristics of authority plus a high level of awareness or popularity.

Google attaches great importance to brands and authorities when ranking websites.

As early as 2009, Google rolled out the Vince Update, which gave large brands a significant ranking advantage.

Not surprising, given this statement:

“The internet is fast becoming a ‘cesspool’ where false information thrives. Brands are the solution, not the problem. Brands are how you sort out the cesspool. Brand affinity is clearly hard-wired. It is so fundamental to human existence that it’s not going away. It must have a genetic component.”

Former Google CEO Eric Schmidt

Brands combine characteristics such as popularity, authority and reputation (i.e., trust). I see trust and authority, alongside document relevance in relation to the search intent, as among the most important criteria for whether Google allows content to appear on page 1 of the search results.

Google cannot afford to place content from untrustworthy sources in the user’s field of vision, especially for YMYL topics.

As a result, many affiliate projects that haven’t bothered to build a brand have fallen flat on their face. Popularity alone only plays a limited role.

Amazon and eBay are very popular brands, but they lack authority in certain thematic areas. That’s why more specialized stores usually rank better than the big e-commerce portals.

Organize an index around entities

A semantic database is organized around entities, their relations and attributes. Unlike a classic database, information is captured around entities, and relationships between entities can be created via edges.

As already mentioned, entities can be provided with labels or information for clear identification and for better classification in the ontological or thematic context. 

Entities are increasingly becoming the central organizational element in the Google index. Insofar as search queries have an entity reference, Google can quickly access all stored information about the relevant entities and relationships to other entities via the Knowledge Graph.

Search queries without reference to entities recorded in the Knowledge Graph are handled as usual, according to classic information retrieval rules. However, Google can now use NLP to identify entities not in the Knowledge Graph, provided the search term contains a grammatical structure of subject, predicate and object (a triple).

Screenshot from the Google NLP-API
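As a rough illustration of this kind of entity analysis, here is a minimal sketch against the Google Cloud Natural Language API, the public counterpart of the capabilities shown in the screenshot (the example sentence is made up; this assumes the google-cloud-language client is installed and credentials are configured):

```python
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()

# A sentence with a clear subject-predicate-object structure.
document = language_v1.Document(
    content="The Empire State Building is a skyscraper in New York City.",
    type_=language_v1.Document.Type.PLAIN_TEXT,
)

response = client.analyze_entities(document=document)

for entity in response.entities:
    # Each entity comes with a type (e.g., LOCATION, ORGANIZATION) and a
    # salience score indicating how central it is to the text.
    print(entity.name, entity.type_, entity.salience)
```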

I think that in the future, there will be an increasing exchange between the classic Google search index and the Knowledge Graph via an interface. The more entities are recorded in the Knowledge Graph, the greater the influence on the SERPs.

However, Google still faces the major challenge of reconciling completeness and accuracy.

The Diffbot Natural Language API Demo shows very nicely how text analysis via natural language processing can be used to collect information about an entity and transform it into a knowledge graph.

In an entity-based Index, you have the following components:

- Nodes (Entities)
- Entity ID
- Entity Name
- Edges (Relationship between entities)
- Attributes
- Digital Representations (could also be own nodes/entities)
- Resources (documents, videos, audios, images, etc.)
- Entity Types or Classes
- Topic Classes and their keyword clusters

The organizational structure around single entities might look like this:

Possible index structure for the entities Taylor Swift and Joe Alwyn
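A toy sketch of such a structure in Python, with hypothetical nodes and edges for the two entities named in the caption (the IDs, relation label and attribute values are illustrative only, not Google's actual data model):

```python
from dataclasses import dataclass, field

@dataclass
class EntityNode:
    entity_id: str                 # unique entity ID
    name: str                      # entity name
    entity_type: str               # entity type or class
    attributes: dict = field(default_factory=dict)
    representations: list = field(default_factory=list)  # digital representations
    resources: list = field(default_factory=list)        # documents, videos, images

@dataclass
class Edge:
    source_id: str   # entity_id of one node
    relation: str    # relationship between the two entities
    target_id: str   # entity_id of the other node

# Hypothetical index entries for the two entities from the example above.
taylor = EntityNode(
    "E1", "Taylor Swift", "Person",
    attributes={"occupation": "singer-songwriter"},
    representations=["official website", "social media profiles"],
)
joe = EntityNode("E2", "Joe Alwyn", "Person", attributes={"occupation": "actor"})

edges = [Edge("E1", "partner of", "E2")]
```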

The structure around an entity is influenced by the entity types and by the attributes mined from the digital representations, documents, videos and other resources Google can crawl and analyze.

So Google can connect topics and their keyword clusters with entities.

The E-A-T evaluation is also based on these resources depending on the signals I mentioned in my article 14 ways Google may evaluate E-A-T.

Non-validated entities next to Knowledge Graph

I think Google has more entities on its radar than just the ones officially recorded in the Knowledge Graph. Since the Knowledge Vault and natural language processing can be used to analyze entities in search queries and in content of any kind, there will be a second, unvalidated database next to the Knowledge Graph. This database could contain everything recognized as an entity and assigned to a domain and an entity type, but not yet socially relevant enough for a knowledge panel.

For performance reasons, something like this would make sense, as such a repository would mean not having to start from scratch again and again. I think it stores all entities whose information cannot (yet) be validated for correctness.

Thus, Google would also be able to apply the signals explained above to entities beyond those recorded in the Knowledge Graph in order to perform E-A-T evaluations.

Overview: Data Mining for the Google Knowledge Graph

Google can recognize semantic relationships between keywords, topics, entities

Since the launch of Hummingbird, Google has sought to identify, extract and relate entities.

The relationships between entities, people and topics are important to Google because they are how it can algorithmically determine contextual relationships, the quality or strength of a relationship and, through that, authority and expertise.

Google can recognize, via co-occurrences of entities and keywords, the topics with which entities appear in context. The more frequently these co-occurrences occur, the greater the probability that a semantic relationship exists. These co-occurrences can be determined from structured and unstructured information in website content and search terms.

If the entity “Empire State Building” is often named together with the entity type “skyscraper,” there is a relationship. Thus, Google can determine the relationship between entities and entity types, topics and keywords. Google can determine the degree of relationship by the average proximity in the texts and/or the frequency of co-occurrences.
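A simple sketch of this idea: count how often two entity names co-occur within a sliding window of tokens and track their average distance, as a rough proxy for relationship strength (the corpus, terms and window size here are made up for illustration):

```python
def cooccurrence_stats(texts, term_a, term_b, window=10):
    """Count co-occurrences of two terms within `window` tokens of each
    other and return (count, average token distance) as a rough signal."""
    count, total_distance = 0, 0
    for text in texts:
        tokens = text.lower().split()
        positions_a = [i for i, tok in enumerate(tokens) if term_a in tok]
        positions_b = [i for i, tok in enumerate(tokens) if term_b in tok]
        for i in positions_a:
            for j in positions_b:
                if abs(i - j) <= window:
                    count += 1
                    total_distance += abs(i - j)
    return count, (total_distance / count if count else None)

# Tiny made-up corpus for illustration.
corpus = [
    "The Empire State Building is a famous skyscraper in Manhattan.",
    "Few skyscrapers are as iconic as the Empire State Building.",
]
print(cooccurrence_stats(corpus, "empire", "skyscraper"))  # (2, 6.0)
```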

For example, Zalando is closely related to other entities such as fashion brands (e.g., Tom Tailor, Nike, Tommy Hilfiger and Marco Polo) and product groups (shoes, dresses, bikinis).

These relationships can vary in strength. Google can use the strength of these relationships to assess expertise and, above all, authority and incorporate them into the E-A-T concept.

Recognize authority and entity relevance via the domain

As already explained, the website is a digital representation of an entity. Google Keyword Planner can be used to display keywords related to a domain.

The keywords are output in a list sorted by relevance, as shown here in the example of the domain footlocker.com.

The keyword combinations in which footlocker appears together with products and topics are interesting. They show in which context users search for the brand Footlocker.

Keyword List based on Footlocker.com

If you then use the filter to remove all keywords containing Footlocker from the list, you get a list of generic keywords that are still sorted by (semantic) relevance to the domain.

Keyword List based on Footlocker.com without Footlocker
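The filtering step described above is trivial to reproduce if you export the keyword list; a sketch, assuming a plain list of keyword strings (the keywords below are hypothetical):

```python
# Hypothetical keyword list exported from Keyword Planner for footlocker.com,
# already sorted by relevance.
keywords = [
    "footlocker",
    "footlocker shoes",
    "nike air max",
    "footlocker near me",
    "running shoes",
]

# Drop every keyword containing the brand name, leaving the generic terms
# Google associates with the domain, still in relevance order.
generic_keywords = [kw for kw in keywords if "footlocker" not in kw.lower()]
print(generic_keywords)  # ['nike air max', 'running shoes']
```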

Exciting? I leave it to everyone to speculate further.

In my experience, such domains have an easier time ranking in Google search for these keywords and topics.

What this all means for SEOs and content marketers

Brand and authority are playing an increasingly important role in search engine optimization. As a result, SEO techniques alone can no longer influence search results; it is just as much about marketing and PR.

In addition to the well-known SEO fundamentals of ensuring crawlability, indexing control, internal linking optimization and website hygiene, it is primarily the triad of relevance, trust and authority that needs to be considered.

For findability on Google, but also in general, SEOs and online marketers should focus not only on content, link building, crawling and indexing control but also on the effects on ranking through brand building. This requires collaboration with the people responsible for branding and PR. This way, possible synergies can already be taken into account during the campaign conception.

Relate your brand to topics/products for which you want to be found

Do this in all marketing and PR activities with a view to Google rankings, be it marketing campaigns or marketing collaborations, such as those Home2go or Footlocker have run to promote certain search query patterns.

Try to generate co-occurrences and links from topic-related editorial environments via PR campaigns or content marketing campaigns.

In general, owning content via owned media, and generating signals via co-occurrences or brand and domain mentions in certain topic environments, can increase the authority of a brand and thus its ranking for keywords located in those environments.

The more clearly Google can identify the positioning of the company, author and publisher, the easier it will be to rank the thematically relevant content linked to this entity.

The post Entities and E-A-T: The role of entities in authority and trust appeared first on Search Engine Land.

15 Best Link Building Tools (Free & Paid)

Posted by on May 13, 2022 in SEO Articles | Comments Off on 15 Best Link Building Tools (Free & Paid)

Link building is an SEO process you cannot run without a tool. Choosing the right link building tool is crucial for finding and capturing high-quality links that can make a difference to your rankings. In this guide, we’ve reviewed and compared the best link building tools, so that you can pick the best one for […]

The post 15 Best Link Building Tools (Free & Paid) appeared first on reliablesoft.net.

Google Search Console to release new video page indexing report

Posted by on May 12, 2022 in SEO Articles | Comments Off on Google Search Console to release new video page indexing report

A new video page indexing report is coming to Google Search Console in the near future, Dikla Cohen, a Web Ecosystem Consultant at Google, announced at Google I/O today. The new report shows you a summary of all the video pages Google found while crawling and indexing your site.

Video page indexing report. The video page indexing report will be found in Google Search Console, under the “Index” tab, under “video pages.” At the time of writing, this feature does not appear to be live yet, but it should be coming soon.

This report shows you a summary of all the video pages Google found while crawling and indexing your site. It will help you:

- See how many video landing pages Google discovered and in how many of them a video was indexed
- Examine reasons for unindexed videos in landing pages
- Use the list of affected video page URLs to debug and fix issues
- Validate fixes to initiate recrawling of known affected URLs

Why we care. Video is an important element of many websites, and these reports will help you understand how your videos perform in Google Search. Google Search Console’s new video indexing report can help you find indexing issues with your videos and debug them.

Check back to find out when this report goes live.

The post Google Search Console to release new video page indexing report appeared first on Search Engine Land.

Link building: the least favorite part of SEO

Posted by on May 12, 2022 in SEO Articles | Comments Off on Link building: the least favorite part of SEO

There are plenty of reasons to love SEO. I certainly do and have since I started my SEO journey in 2007. 

But every job has unique challenges – aspects that can be frustrating, difficult, tedious or even downright painful. 

That’s why, earlier this week in the Search Engine Land newsletter, I asked readers: What is your least favorite part of SEO?

Well, we have a winner. Or loser?

It’s link building. More than 20% of respondents said link building was their least favorite part of SEO. 

Let’s dig into the results. 

Link building and outreach. It’s time-consuming. It’s tedious. And success is never guaranteed. These are just a few of the biggest complaints we saw from Search Engine Land readers about trying to build quality links:

- “Getting backlinks – it is the equivalent of a vampire sucking your soul for very little return.”
- “Obviously, getting natural links is nice, but when you’re out there fishing for them, it may or may not be fruitful. It’s so time-consuming and there’s no guarantee you’ll see the results you want.”
- “Too much work to get any reward. It often feels pointless.”
- “Dealing with websites that used xyz backlinks in the past. Sometimes, I’ll inherit a client whose previous agency did so, and I end up having to clean it up because it looks bad on my end if potential clients or fellow SEOs are looking at my clients’ backlink profiles to see how we do SEO at my agency.”
- “There is nothing more tedious and frustrating than reaching out to website owners who are getting spammed 24-7 by all and sundry to try and build a relationship in order to promote an asset your team has spent loads of time creating to either be ignored or to have to go through the awkward process of negotiating only to end up having to chase for weeks or even months to get the link live. It’s like, ‘Head, meet brick wall…’ This is a direct result of so much bad practice out there coming back to impact people who are trying to get it right.”
- “It takes so much time, thought and care to build lasting links, especially at scale in the B2B space.”
- “It’s the most abused area of SEO.”

Google. Yes, Google. There were a range of complaints. A few were specific to Google Business Profiles:

- “Way too many spam listings outranking actual, quality businesses. Legitimate reviews are being removed. If you happen to get suspended for no reason, support is a nightmare to deal with to get your listing back up. And yet, you’ve got to deal with GBP, at least in local SEO, because of its prominence in the SERPs.”
- “Lack of control. Such an important listing for local businesses, yet so volatile. Make it paid already and give us more control and better customer service.”
- “Too many businesses are able to create spam websites and GBP listings, which makes it a volatile space. The world would be a better place if spammers didn’t exist. Now I have to fight spam each day to help my client be in the position they were supposed to be in.”

But our readers shared other Google-related complaints, ranging from algorithm update timing to GA 4:

- “Google releasing algo updates right before the holidays.”
- “The metrics for the so-called ‘Page Experience’ are so ambiguous in testing and evaluating.”
- “They’re trying their darndest to keep everyone in their ecosystem instead of allowing clicks through to sites. Their profits last year say it all.”
- “Dealing with and understanding white lies coming from Google representatives.”
- “It took me a while to set up Google Analytics and it’s my 24/7 go-to for monitoring traffic to my online fashion store. I’m not looking forward to moving to the next generation, GA4, especially while all the features I currently use are not yet available. I’m sure I’ll eventually adapt but it will take some time.”

Proving the worth of SEO. Have you had to convince your organization that SEO is a smart investment? The answer should be as simple as, “Have you heard of this thing called Google?” Well, now you can point them to this article: Why SEO is a great investment, not just a cost.

Defending the value of SEO shouldn’t be such a struggle anymore. It’s 2022. Yet here we are:

- “People not taking SEO seriously or understanding the worth of its investment and time.”
- “The expectation that results will be immediate and positive every time.”
- “Having to work double as hard as PPC managers to prove SEO’s worth.”
- “Explaining to clients why an automated report they received from a cousin’s, best-friend’s former roommate has no context and should be ignored.”
- “The constant doubting from other ‘SEO experts’ on the marketing team is demoralizing and demotivating. It’s hard to feel confident as an SEO, because the foundations keep shifting.”
- “Pushback from uneducated executives or ones who have been jaded by snake-oil SEOs in the past. Makes it really tough to execute on what you know will work, especially when the proof of whether your tactics work or not can take so long in coming to validate your strategies.”
- “The amount of effort and time sometimes necessary for clients to implement the changes needed for them to be successful in organic search.”

More least favorite parts of SEO. Finally, a few randoms. These answers didn’t fit into any of our other buckets, but they are all valid reasons for these being called out as a least favorite part of SEO:

- “Technical optimisation. Never straightforward as to what exactly is causing the issues, and more often than not you need help from a developer. It’s rewarding once the problem is solved but the process is a bit of a maze.”
- “Watching what appears to be a well-optimized page rise and fall sporadically in the SERPs. It’s maddening.”
- “The uncertainty of what actually influences SERP position. It seems like a mystery and a moving target which makes it hard to manage.”
- “Keyword analysis, hangs me out of my throat. Uses 5 different tools, and has trouble stopping when enough is enough.”
- “Reporting. Nobody reads reports sent by the SEO team. You could send the same report every month and nobody would notice it. I like reporting by exception.”
- “Cheap, unappreciative clients.”
- “Digging through tons of articles on SEO topics posturing as new insights, that actually contain old platitudes or even bad analyses, just to find those few pieces a month that actually bring some new insight.”
- “Trying to find unique product titles for 2,000 products made with the same metal (jewelry). When I see squiggly or zig-zag on a site, I know I am not alone!”
- “Identifying expertise among agencies. When everyone starts with a similar checklist of best practice recommendations, it’s difficult to know who will be able to transition into an individualized strategy for our business.”
- “Dealing with developers. It’s always hard to convince them about how important stuff is and to have them added to their backlog soon.”
- “Keeping up with technology and UI changes in all the different platforms. I’m fine with keeping up with best practices, trends, evolving standards, strategies, and explaining them to internal and external stakeholders. It’s adapting to all the menus and paths in all the different software and platforms in order to get things done. Every day, I’m closer to becoming my dad struggling to reprogram the VCR.”
- “Well… it depends.”

(Note: you can read even more answers to this same question on Twitter. Google’s John Mueller asked the same question.)

Why we care. It’s good to share our frustrations with our peers. Clearly, many of you are experiencing some similar pain points in the SEO world. Just remember, it’s completely normal to not like parts of your job all the time. That could go for certain tasks, projects, clients or co-workers. And if you love link building? We salute you!

The post Link building: the least favorite part of SEO appeared first on Search Engine Land.

You still don’t have marketing security?

Posted by on May 12, 2022 in SEO Articles | Comments Off on You still don’t have marketing security?

Why has ‘marketing security’ become a top priority for modern-day marketers?

Five years ago, if you were to ask a marketer about their security strategy, the likely response would have been sheer confusion. “Bots, proxies, data-center traffic? That’s for the security team to worry about.” In 2022, however, you’d be hard-pressed to find a marketing leader who hasn’t deployed a marketing security strategy. Today, most marketers view fake, automated and malicious traffic as a strategic threat to their operation, compromising efficiency and hurting their bottom line.

27% of website traffic is fake, and it’s killing marketing efficiency

Recent data released by CHEQ across a pool of over 12,000 of its customers revealed that 27% of all website traffic is fake, consisting of botnets, data centers, automation tools, scrapers, crawlers, proxies, click farms and fraudsters. The scale of the “Fake Web” is massive, and marketers are seeing it everywhere. Just this past Super Bowl, 17 billion ad views came from bots and fake users. On Black Friday, a third of online shoppers weren’t real. Affiliate marketers are losing $1.4 billion a year to fraud. Elon Musk recently highlighted concerns over bots overrunning social media and Spotify is reportedly suffering from its own bot problem. Wherever marketers look, the Fake Web is there, and it’s affecting their campaigns, funnels, data and revenue. 

Paid marketers without security ‘waste’ thousands of clicks on fake users

Perhaps one of the most visible issues for marketers, especially those running paid user acquisition, is Click Fraud. Bots, click farms and even competitors are draining their ad budgets and severely damaging campaign efficiency. Many advertisers suffer from thousands and even tens of thousands of fake clicks every month, amounting to a massive waste of spend. But it’s not just the wasted spend, it’s also budgets that could have otherwise gone to real paying customers who would have generated actual revenue. In fact, recent data shows that $42 billion is lost each year in revenue opportunities because of this issue. 

But the real damage begins when those fake users infiltrate your audiences

Many paid marketers use smart campaigns or audiences to group together users that have either previously shown interest in their products or services or share attributes with users who have. This is helpful for expanding the market they are addressing and reaching new potential buyers. At this point, it might not come as a surprise that bots and fake users can stand in the way of successfully executing this practice as well. When audiences become polluted with malicious human users or invalid bot traffic, marketers end up accidentally re-targeting and optimizing toward fake traffic. If marketing security measures are not put in place, the cycle can continue until audiences are overtaken by bots and no longer share any resemblance to a group of human users that have the ability and intention to convert. If clean audience segments are a priority, then, for many marketers, marketing security is as well.

Fake traffic is also one of the biggest drivers of poor lead quality

Every marketer can relate to the frustration of illegitimate-looking inbound leads. Sometimes it’s a fake account or a bogus email address. Sometimes the information looks legitimate, but when you research the lead you can’t find the company or individual. Whatever the case, nothing causes more tension between sales and marketing than bogus leads that waste the sales team’s time and never convert. In fact, poor traffic quality is one of the biggest drivers of marketing security adoption today, as teams look to eliminate illegitimate form fills and submissions and prevent them from polluting the sales pipeline.

But perhaps the biggest reason marketers are fighting bots is data quality

Beyond the monetary waste, budget inefficiency, polluted audiences and fake leads, there is one issue that stands above them all, and it is perhaps the biggest driver of marketing security adoption: data quality. Think about it: organizations spend enormous energy, time, effort, resources and money on data management and consumption, including expensive BI, analytics and reporting tools, teams of analysts, CDPs and DMPs. All of this so that they can drive better tactical decisions around landing page optimization, audiences and targeting, as well as strategic decisions around budget and channel planning, growth planning and revenue forecasting. When an average of 27% of traffic in the funnel is fake, all that data is skewed and those decisions are severely compromised. Adding a layer of visibility to detect bots and fake users, and to gain transparency over the funnel, is becoming an integral part of the modern-day marketer’s role.

More than anything, marketing security is being looked at as an opportunity

Marketers want to eliminate these threats to their operation, but above all, they want to drive better budget efficiency, better leads and higher revenue, and that’s the ultimate goal of marketing security. Eliminating these inefficiencies creates a healthy, clean and transparent funnel that delivers better results. For these reasons, asking a marketer “what’s your security strategy?” in 2022 is quickly becoming an almost banal question, as marketing security becomes an industry standard.

This article was written by Daniel Avital, chief strategy officer and global head of marketing at CHEQ.

The post You still don’t have marketing security? appeared first on Search Engine Land.