If you’re reading this, chances are your brand’s customer acquisition strategy relies heavily on search engine optimization (SEO), and you want to make sure you’re not losing money on counterproductive tactics. But how can you identify practices that may harm your search rankings?
One challenge with SEO is that Google and other search engines don’t directly reveal how their algorithms work. Instead, SEO professionals have to study patterns, run independent tests, and measure growth to deduce what’s effective. That opacity has allowed questionable practices designed to “game the search engines” to creep in.
Leaders in the field have gathered evidence over the years confirming that some SEO practices actually harm rankings in the long run. Some of that evidence comes directly from Google Search Central’s webmaster guidelines, which underscores the importance of staying away from such questionable practices.
To help you focus your SEO efforts so that you don’t unknowingly sabotage your own search rankings or incur manual penalties from Google, we will discuss the five most pervasive factors that can harm your rankings.
It’s also worth noting that not all search ranking fluctuations are caused by something that’s wrong on your website. Things like unconfirmed Google algorithm updates can also lead to a shift in rankings.
This is why it is crucial that you’re able to pinpoint the exact problem to ensure both your time and money are spent on things that actually require fixing.
Let’s explore further how you can diagnose the problem, understand the severity of the risks, and finally, take steps to fix things that might have already gone wrong.
1. Low Quality, Manipulative Links
Ever since Google search started using the PageRank algorithm to determine search rankings, backlinks have been one of the most critical factors for winning in the Search Engine Results Pages (SERPs).
However, the Google algorithm makes a distinction between backlinks received from trustworthy sources and those obtained via link schemes. Google treats links from spammy websites or those won through “unnatural” methods, such as paid guest posts, as manipulative and low quality.
These are directly in violation of Google’s webmaster guidelines. In turn, they open your website up to the risk of a manual action or even removal from Google’s search index.
How to Identify Low-Quality Links
Your Google Search Console messages should show whether Google has asked you to fix unnatural links.
If you haven’t received a message, navigate to Search Traffic > Links to Your Site. Export the list and identify domains with low traffic and a high spam score. Google’s own documentation should serve as a reference for which links you keep and which you get rid of.
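If the export is large, a short script can help you shortlist candidates for manual review. The sketch below assumes you’ve combined the Search Console export with spam-score and traffic metrics from a third-party tool (such as Moz or Ahrefs) into a single CSV; the column names (`domain`, `spam_score`, `monthly_traffic`) and thresholds are illustrative only, not a standard format.

```python
import csv

# Illustrative thresholds, not official guidance; tune them to your niche.
SPAM_SCORE_THRESHOLD = 30   # e.g. a Moz-style spam score above 30%
TRAFFIC_THRESHOLD = 100     # very low monthly traffic is another warning sign

suspect_domains = set()
with open("links_to_your_site.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        # Column names are assumptions; match them to your actual export.
        spam_score = float(row.get("spam_score") or 0)
        traffic = float(row.get("monthly_traffic") or 0)
        if spam_score >= SPAM_SCORE_THRESHOLD and traffic <= TRAFFIC_THRESHOLD:
            suspect_domains.add(row["domain"])

print(f"{len(suspect_domains)} domains flagged for manual review:")
for domain in sorted(suspect_domains):
    print(" -", domain)
```

Treat the output as a shortlist to review by hand, not a ready-made disavow list; a low-traffic site isn’t automatically spammy.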
The Fix
Once you have zeroed in on any unnatural links, email each webmaster and ask them to remove the links to your website. If those low-quality links are still live after 6-8 weeks, you can disavow them, and that should take care of the problem.
If you have received a manual penalty, you need to disavow the unnatural links and then file a reconsideration request through Google Search Console.
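If you do end up disavowing, the file you upload to Google’s disavow tool is plain text: one URL or `domain:` entry per line, with `#` marking comments. A minimal sketch for generating it (the domain names are placeholders):

```python
# Domains you have decided to disavow after outreach failed; placeholders only.
flagged_domains = ["spammy-directory.example", "paid-links.example"]

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Outreach sent; links still live after the 6-8 week window\n")
    for domain in sorted(set(flagged_domains)):
        f.write(f"domain:{domain}\n")
```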
2. Content Quality and User Intent
When updating or changing content on your website, make sure it is exceptionally high quality, relevant, and matches the search intent of the queries it’s supposed to rank for.
Google defines a piece of content as “high quality” when it has the most up-to-date information, an engaging voice, social proof like reviews or testimonials, or comes from trusted sources in the field. A webpage’s security can also influence how Google and actual users judge its content quality, so it’s essential not to neglect it.
Then comes the search intent behind your content, which is the essence of how the Google algorithm works. It’s how Google retains its dominance as the most used search engine globally and differentiates itself from the likes of Bing, which can show less relevant search results.
Search intent is the “why” behind a search query for any given target keyword. It distinguishes, for instance, between what a user means when they type “coffee” into the search bar versus “coffee shops near me.”
A quick Google search will show that the two keywords pull up different SERP features, and the mix of blog posts versus store listings differs significantly.
How to Diagnose the Content Issue
If you lost rankings after making changes to content, and you have determined no other factors are at play, you can do one of the following:
- Export all the ranking pages from your Google Analytics dashboard and compare their content against an older backup of your website (a rough comparison sketch follows this list).
- Install a tool like OnWebChange that notifies you of any changes made to your website design or content (recommended option).
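For the first option, a rough sketch of the comparison is below, using Python’s built-in difflib. The file paths are placeholders; point them at whatever HTML or text snapshots of the old and current page you actually have.

```python
import difflib
from pathlib import Path

# Placeholder paths: a page from an older backup vs. the current version.
old_page = Path("backup/blog/seo-guide.html").read_text(encoding="utf-8")
new_page = Path("current/blog/seo-guide.html").read_text(encoding="utf-8")

diff = difflib.unified_diff(
    old_page.splitlines(),
    new_page.splitlines(),
    fromfile="backup",
    tofile="current",
    lineterm="",
)
for line in diff:
    # Lines starting with "-" were removed; lines starting with "+" were added.
    print(line)
```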
The Fix
Use your website’s old backup as a reference to rewrite the problematic text into new, high-quality content. You can also add any missing relevant keywords and tailor the content to match the type of results Google prefers to show in the SERPs for the focus keywords you’re targeting.
3. Metadata
Metadata helps Googlebot broadly understand what a page is about. Changing or using the wrong title tag, in particular, can adversely impact Google’s understanding of the page’s content.
Since meta tags like the title and meta description are also displayed in SERPs, not having the correct ones in place can affect how many clicks you get.
Overall, metadata affects both user experience and SEO. Making sure the page title is the most accurate description of the page’s content is key: even changing your title tag from “SEO for enterprises” to “Enterprise SEO company” may influence how the algorithm views your page.
While the meta description itself isn’t a ranking factor, it cannot be overlooked any more than the title tag can. Without adequate clicks, any search ranking is rendered meaningless anyway. Following the guidelines Google specifies for metadata is key to controlling how it appears in search.
That’s because Google may still choose to show something else in the snippet instead of what you originally wrote, especially if your metadata runs too long. Keeping title tags and meta descriptions to a sensible length reduces the probability of Google truncating or rewriting them in the SERPs.
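As a quick sanity check, you can flag metadata that is likely to be truncated. The limits below are common community guidelines rather than official Google rules (Google truncates by pixel width, not by a fixed character count), and the sample page data is purely illustrative:

```python
# Rough character guardrails; Google actually truncates by pixel width.
TITLE_MAX = 60
DESCRIPTION_MAX = 155

pages = [  # illustrative data; feed in your crawl export instead
    {
        "url": "/services/",
        "title": "Enterprise SEO Company | Example Agency",
        "description": "We help enterprise teams grow organic traffic with audits, content, and technical SEO.",
    },
]

for page in pages:
    title_len = len(page["title"])
    desc_len = len(page["description"])
    if title_len > TITLE_MAX:
        print(f"{page['url']}: title is {title_len} chars (over {TITLE_MAX})")
    if desc_len > DESCRIPTION_MAX:
        print(f"{page['url']}: description is {desc_len} chars (over {DESCRIPTION_MAX})")
```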
How to Diagnose a Metadata Issue
Your SEO or web developer should have a list of all changes that have been made to the metadata on the web pages recently. If they don’t, it’s time to start documenting that internally.
Next, use a tool like Screaming Frog to crawl the website so that you have a quick overview of the meta tags currently being served. You can then compare these recent changes against an older backup of the website.
The Fix
Once you have the old and new metadata, start a spreadsheet and list both in columns next to each other. If the new metadata caused the drop, reverting to the old metadata should fix the problem.
You can also add any missing relevant keywords from the old metadata to the new one. Wait for Googlebot to recrawl the page and see if this fixes the issue.
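If you’d rather script the comparison than build the spreadsheet by hand, here is a minimal sketch. It assumes two Screaming Frog-style exports with `Address` and `Title 1` columns; column names vary by tool and version, so adjust them to whatever your crawler produces.

```python
import csv

def load_titles(path):
    # "Address" and "Title 1" are assumed column names; adjust to your export.
    with open(path, newline="", encoding="utf-8") as f:
        return {row["Address"]: row.get("Title 1", "") for row in csv.DictReader(f)}

old_titles = load_titles("crawl_old.csv")
new_titles = load_titles("crawl_new.csv")

for url, old_title in old_titles.items():
    new_title = new_titles.get(url)
    if new_title is not None and new_title != old_title:
        print(url)
        print("  old:", old_title)
        print("  new:", new_title)
```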
4. Manipulative SEO Tactics
Beyond link schemes, there are other tactics that SEO professionals deploy for quick wins that can easily backfire. These tactics are perceived as spam and/or attempts to manipulate the search rankings, which goes against Google’s mission of showing the most relevant search result first. Here are a few of these harmful tactics.
Keyword Stuffing
SEOs sometimes stuff a page with keywords and set the text color to match the background so users never see them. Other times, the user lands on the page and reads content that simply doesn’t make sense.
Some SEOs have been known to add these keywords as a list or a group in a way that’s out of context and defies standard rules of language and syntax.
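There is no official “safe” keyword density, but a crude self-audit can flag copy where a single phrase dominates the text. The sample text and threshold below are illustrative only:

```python
import re

def phrase_density(text, phrase):
    """Fraction of word positions where the phrase starts (a crude proxy)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase_words)
    return hits / max(len(words), 1)

page_text = "best coffee shops near me, best coffee, best coffee shops in town"  # sample copy
density = phrase_density(page_text, "best coffee")
print(f"'best coffee' starts at {density:.1%} of word positions")
if density > 0.05:  # arbitrary illustrative threshold
    print("Copy may read as keyword-stuffed; review it as a human would.")
```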
Doorway Pages and Cloaking
Have you ever searched for a torrent and been directed to a spammy or pornographic website?
With cloaking, SEOs show one version of a page to search engines and another to users. Once the user clicks on the search result, the page often redirects to a spammy or useless page that has nothing to do with what the user wanted to see.
Cloaking is often done in conjunction with doorway pages. SEOs create multiple pages to rank for specific keywords and use those as a “doorway” to the main content page — which may be utterly useless for the user.
Both are in clear violation of the webmaster guidelines, and Google has been issuing severe penalties to tackle their use.
Masses of AI-Generated Content
According to Google’s John Mueller, all AI-generated content is against their guidelines.
“For us, these would, essentially, still fall into the category of automatically generated content which is something we’ve had in the Webmaster Guidelines since almost the beginning,” he responded during a Google Search Central SEO office-hours hangout when asked about the usage of GPT-3 powered AI writing tools.
When probed about whether Google can auto-detect the use of AI content, he added that he couldn’t claim it can. He hinted that finding and taking action against AI-generated content would fall to Google’s human review team.
Many SEOs, including some of the top names in the field, strongly believe that content generated entirely by AI is trash. But many still advocate using AI for fact-checking, proofreading, editing, and refining content.
Miranda Miller of Search Engine Journal supported this view. “The Associated Press began using AI for story generation in 2014. Putting AI to work in content creation is not new, and the most important factor here is its intelligent application,” she added, pointing out that AI can help content creators overcome language and literacy barriers, improve the quality of their writing, and more.
5. Slow Site Speed
Since an announcement on its developer hub in 2010, Google has used site speed and user experience as ranking factors. It also offers its own PageSpeed Insights tool, further supporting this claim.
In recent years, site speed has been folded into the broader umbrella of page experience, which combines Core Web Vitals with mobile-friendliness, safe browsing, security (HTTPS), and more.
According to Google, this is an effort to get people the most helpful and enjoyable experience on the web.
An excerpt from Google Search Central reads, “We encourage you to start looking at your site’s speed (the tools above provide a great starting point) — not only to improve your ranking in search engines, but also to improve everyone’s experience on the Internet.”
Besides search rankings, site speed also affects bounce rate. According to Websitesetup.org, a load time of 6 seconds can increase bounce rate by up to 106%. So site speed is worth paying attention to for both rankings and conversions.
How to Diagnose a Site Speed Issue
A load time of 0-2 seconds is recommended. You can use the PageSpeed Insights tool or GTmetrix to check your website’s load time.
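You can also pull the same data programmatically from the public PageSpeed Insights API (v5), which is handy for checking many URLs at once. A minimal sketch, assuming the response keeps the `lighthouseResult` structure it has at the time of writing (an API key is only needed at higher volumes; the URL is a placeholder):

```python
import json
import urllib.parse
import urllib.request

page = "https://www.example.com/"  # placeholder URL
query = urllib.parse.urlencode({"url": page, "strategy": "mobile"})
api = f"https://www.googleapis.com/pagespeedonline/v5/runPagespeed?{query}"

with urllib.request.urlopen(api) as resp:
    data = json.load(resp)

# The exact response structure may change; inspect the JSON you get back.
lighthouse = data["lighthouseResult"]
score = lighthouse["categories"]["performance"]["score"]
lcp = lighthouse["audits"]["largest-contentful-paint"]["displayValue"]
print(f"Performance score: {score:.0%}, Largest Contentful Paint: {lcp}")
```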
The Fix
There are many things a web developer or SEO can do to speed up your website, including minifying JavaScript, optimizing images and videos, improving server response time, and reducing redirects.
Use the report generated by your tool of choice as a starting point for improving your site speed and Core Web Vitals.
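For one of the quicker wins, reducing redirects, a small script can show how many hops a URL takes before it resolves. This sketch uses the third-party `requests` library; the URL is a placeholder.

```python
import requests  # third-party: pip install requests

url = "http://example.com/old-page"  # placeholder URL
resp = requests.get(url, allow_redirects=True, timeout=10)

if resp.history:
    print(f"{len(resp.history)} redirect(s) before the final URL:")
    for hop in resp.history:
        print(f"  {hop.status_code} {hop.url}")
    print(f"Final: {resp.status_code} {resp.url}")
else:
    print(f"No redirects: {resp.status_code} {resp.url}")
```

Each extra hop adds latency before the page even starts rendering, so collapse chains into a single redirect wherever possible.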
Conclusion
SEO can be convoluted and time-consuming, so narrowing your focus and playing the long game is the key to getting a good ROI.
At the end of the day, Google is clear about what it’s looking for: the best answer for each searcher’s unique needs. So everything you can do to position your page as the best answer to relevant queries and avoid mistakes that counter those efforts is another step up the search rankings ladder.
If you are interested in reading more about factors that can influence search rankings positively, we recommend the following resources: