Most common SEO issues found on sites with declining SEO

January 26, 2021 by Amer Bekic

I constantly study many sites to understand why some see their SEO drop (sometimes after Google updates). Here are the main errors I identified, the most common ones. Check whether you are making them!

I am in contact with a lot of people in SEO and I have access to a lot of sites, which allows me to run many analyses.

I constantly assess the “SEO health” of these sites, whether their organic traffic is increasing or declining …

In this article, I will reveal the most frequently observed problems among all the studied sites whose SEO has dropped.

The idea is simple, and so is my advice:

If your site is affected by at least one of the issues below, correct it (or them) completely and as soon as possible: first, because Google does not take changes into account immediately, and second, because the algorithm needs to see a significant improvement across the whole site before letting you climb back up.

For each of these sites:

  • Additional analyses were carried out for criteria not covered by my tool (for example, backlinks or advertising)
  • I observed that the site was affected by one or more of the issues listed below
  • I cannot guarantee that these issues are the direct cause of its drop in SEO, but I maintain that they must be corrected before the site can hope to recover

In reality, everything detailed here also applies to sites whose SEO is doing well but which have certain weaknesses; by correcting them, their SEO could improve.

The following issues are sorted by decreasing frequency: I list first the most common problems encountered across all the sites I have audited.

# 1 Lack of content in the main area

This is by far the most frequent problem among the sites I was able to study whose organic rankings keep falling. We have only been saying for 20 years that content matters; we will probably have to wait a while longer before everyone understands it 🙂

OK Olivier, how many words are needed per page? Is 300 words enough?

This is what I am regularly asked. This 300-word figure has been repeated so often that it has become a reference point. However, we must understand:

  • what matters is that the text is interesting. There is no need to pad it just to please Google's algorithm or an audit tool.
  • when we talk about a number of words, it refers to the main part of the page. If the page is an article like this one, it is the number of words in the article itself, certainly not all the words on the page including the menu, the sidebar, the additional areas, and the footer! (A word-count sketch follows this list.)
  • the expected quantity depends on the context, for example the type of site or page. For an article like the one you are reading, 1,000 to 1,500 words seems like a good goal (this one exceeds 3,500). Adapt the target to the context.
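
To make the distinction concrete, here is a minimal sketch of how you might count only the words in the main content area, ignoring menus, sidebars, and footers. It assumes the main content lives in an <article> element (adjust the selector to your own template) and that the requests and BeautifulSoup libraries are installed; the URL is just a placeholder.

```python
# Minimal sketch: count words in the main content area only,
# ignoring menus, sidebars and footers.
# Assumes the main content sits in an <article> tag (adjust the
# selector to your own template).
import requests
from bs4 import BeautifulSoup

def main_content_word_count(url: str, selector: str = "article") -> int:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    main = soup.select_one(selector)
    if main is None:
        return 0  # selector not found: fix the selector, not the page
    # Drop embedded navigation/scripts that are not editorial content
    for tag in main.select("nav, script, style, aside, footer"):
        tag.decompose()
    return len(main.get_text(separator=" ").split())

if __name__ == "__main__":
    print(main_content_word_count("https://example.com/some-article/"))
```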

# 2 Out-of-date content and false promises

As I have said several times:

The visitor who comes from Google's organic results is entrusted to us by Google. It is up to us to take care of them! If we disappoint them, Google will eventually notice. And since Google does not want to disappoint its users, its algorithm is programmed to demote sites that disappoint visitors too often.

In my panel, I encountered a number of sites with two errors that I have grouped together:

  • outdated content that is no longer kept up to date: the visitor realizes this very quickly and loses confidence in the site. Trust is one of the three pillars of solid SEO: Expertise + Authority + Trust (E-A-T).
  • a false promise: in the SERPs, the visitor clicks on your result because the title (and/or description) suggests they will find what they are looking for. Once on the page, the reality is quite different! The worst case is when the site's goal is not to provide the information or solution the visitor wanted, but to push them to buy a product or service, or to fill out a lead form. Pay attention to your title tag: of course it must encourage clicks, but it must also correspond to the content of the page.

In a similar vein, I also encountered the following errors:

  • a real estate agency that has left expired listings online for years. The goal is to show that, overall, it has handled a large number of transactions. But if, in 90% of cases, the visitor arriving from Google lands on an expired listing, what will they do? Go back to the SERPs, a catastrophic signal for the site.
  • a large eCommerce site that leaves many products online without a buy button; these are not out-of-stock items, but items no longer available for sale
  • a site with strong editorial content whose pages announce upcoming events… that actually took place years ago. Delete them!
  • a footer proudly displaying a copyright year three years out of date
  • a legal notice link that does not point anywhere (the famous <a href="#">)

Apart from testing all these scenarios by hand, how do you spot them? For my part, I do not (yet) know how to locate these pages automatically from their characteristics. On the other hand, by analyzing the number of visits they generate over one full year (be careful not to take a shorter period), I can spot a large number of them, as in the sketch below.
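
Here is a minimal sketch of that approach: flag pages with almost no organic visits over the last 12 months, from an analytics export. The CSV file name and the column names (page, organic_sessions) are hypothetical; adapt them to your own export. It assumes pandas is installed.

```python
# Minimal sketch: flag pages that received almost no organic visits
# over a full 12-month period, from an analytics export.
# Column names are hypothetical; adapt them to your export.
import pandas as pd

VISIT_THRESHOLD = 10  # fewer than this over 12 months = candidate for review

df = pd.read_csv("organic_visits_last_12_months.csv")
stale_candidates = df[df["organic_sessions"] < VISIT_THRESHOLD]

print(f"{len(stale_candidates)} pages with fewer than {VISIT_THRESHOLD} organic visits in a year")
print(stale_candidates.sort_values("organic_sessions")["page"].head(50).to_string(index=False))
```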

# 3 Insufficient quality on the page

When Google started saying that its algorithm (Panda, in 2011) could assess quality, I was skeptical, like you. But I have since changed my mind: first because I noticed that, overall, Google Panda does it quite well, and then because, at my own small scale, I was able to do it too.

Only pages with a very low QualityRisk score generate visits, but the problem is that there are few of them on the site. In short, in my opinion and experience, it is urgent to strengthen the quality of the pages with a high QualityRisk.

Among the sites in my study panel, I found many examples of pages of insufficient quality:

  • categories without any product (or article or listing): the impact is catastrophic
  • categories with a single product (or article or listing), without any introductory text for the category: this is far from enough
  • product pages that are far too thin, with barely three lines of description
  • blog posts padded to 300 words each, which often turn out to be bland
  • lots of tag pages (often more numerous than the articles themselves), without any specific content
  • (big) pages where 99% of the words are clickable
  • in a forum, (old) discussions without any reply
  • in a forum, profile pages (presenting no useful information)

Action: analyze the intrinsic quality of your pages, then improve what you can, starting with the pages with the best potential. A rough heuristic is sketched below.
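
The sketch below is a rough, hypothetical heuristic, not the QualityRisk metric mentioned above: it simply flags pages whose main content is very short or made up almost entirely of links. The selector, thresholds, and URL are assumptions to adapt to your own site; it requires requests and BeautifulSoup.

```python
# Minimal sketch of a rough quality heuristic (NOT the proprietary
# QualityRisk metric). It flags pages whose main content is very short
# or made up almost entirely of links.
import requests
from bs4 import BeautifulSoup

def rough_quality_flags(url: str, selector: str = "article") -> dict:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    main = soup.select_one(selector)
    if main is None:
        return {"url": url, "error": "main content not found"}
    words = len(main.get_text(separator=" ").split())
    link_words = sum(len(a.get_text(separator=" ").split()) for a in main.find_all("a"))
    link_ratio = link_words / words if words else 1.0
    return {
        "url": url,
        "words": words,
        "link_ratio": round(link_ratio, 2),
        "thin": words < 300,          # arbitrary threshold, adapt to your context
        "mostly_links": link_ratio > 0.8,
    }

print(rough_quality_flags("https://example.com/category/empty/"))
```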

# 4 Catastrophic UX and/or design

It is hard to list everything to avoid at all costs, as it is often an accumulation of details, such as:

  • the main content is only visible after scrolling down part of the page
  • lots of banners, pop-ups, and pop-ins prevent comfortable reading of the page
  • the mobile version is non-existent or very difficult to use
  • the design is very old and totally outdated
  • the text too often relies on keyword stuffing, as well as too much bolding
  • the menu is either far too dense (mega menu), based on labels that mean nothing to the user, or unusable (technical bug)

These problems often stack up with those related to advertising or sometimes site speed (which I discuss below).

Action: ask as many people as possible to visit your site (on different devices), browse it, and report back to you.

# 5 Too artificial backlinks

Backlinks are good, even excellent, when they are of good quality. They amplify the effect of technical and, above all, editorial optimizations.

As long as the link-building strategies are clean!

This is where the problem arises. The "youngsters" who never lived through the Penguin massacres do not realize that they are playing with fire by buying or creating lots of artificial links. Others knew that era well, but feel that Google penalties for artificial backlinks are rare enough to make "easy links" worth the risk.

In short, I have seen several cases where artificial backlinks seem to have played a role in the drop in SEO, sometimes quite brutally.

It is always difficult to be certain when studying just one site. But when several sites drop at around the same time and share this "problem", it becomes much more concrete and obvious.

Action: check your links and reread Google's guidelines to see whether you have crossed the line. One quick way to get an overview of your anchor-text profile is sketched below.
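
As one possible starting point (not a method prescribed by Google), here is a minimal sketch that summarizes the anchor-text distribution of your backlinks from a CSV export of whatever backlink tool you use. The file name and the "anchor" column are hypothetical; a profile dominated by exact-match commercial anchors is a classic warning sign.

```python
# Minimal sketch: summarize the anchor-text distribution of a backlink
# export. The file name and column name are hypothetical.
from collections import Counter
import csv

with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    anchors = [row["anchor"].strip().lower() for row in csv.DictReader(f)]

total = len(anchors) or 1  # avoid division by zero on an empty export
for anchor, count in Counter(anchors).most_common(20):
    print(f"{count:5d}  {count / total:5.1%}  {anchor or '(empty anchor)'}")
```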

# 6 An obvious lack of Expertise, Authority, and Trust

It is pointless to object that E-A-T is only a concept intended for Google's search quality raters and not an official ranking factor…

I see more and more cases where a site has dropped due to a lack of one or more of the Expertise, Authority, and Trust components.

Depending on the business sector, I have the impression that one or another of these concepts matters more. For example:

  • in sectors such as health or tax exemption, the author's expertise must be obvious and far superior to what is already found on the first page of Google
  • for eCommerce, trust and reputation must be excellent. This does not just mean "no negative reviews": it also means reviews, brand mentions, and concrete elements that demonstrate credibility.

# 7 Too much intrusive advertising

Monetizing your site through advertising: OK.

Flooding the pages with advertisements of all kinds until they become more prominent than the content: no!

As a user, you know that too much advertising is harmful. So why forget it as a site publisher?

Here is what to avoid in my opinion, or at the very least avoid accumulating:

  • ads so well integrated into the content that the visitor can no longer tell the difference (basically, do not follow Google's example with its AdWords ads)
  • mobile app installation pop-ups
  • notification subscription pop-ups
  • newsletter subscription pop-ups
  • interstitial ads
  • videos that start playing by themselves (even worse: with sound on)
  • floating ads in the page that hinder navigation
  • very frequent affiliate links in the content, not always relevant
  • sites where 90% of the content is affiliate-driven (with outgoing links that include tracking)

Let me finish with a short quote from Google's guidelines for its quality raters (Distracting/Disruptive/Misleading Titles, Ads, and Supplementary Content):

Pages that disrupt the use of main content should be rated low. A single pop-up ad with a visible close button is not too disruptive but does not promote user experience. Here are two examples of situations that we consider disruptive:

– Ads that actively float on the main content as you scroll down the page and are difficult to close. It can be very difficult to use the main content when it is constantly covered by ads that are difficult to close.

– An interstitial page that redirects the user away from the main content without allowing them to return to it

What to do? Here too, ask third parties who do not yet know your site to test it and give you their opinion. Take a step back and remove the ads that hurt the user experience the most. Run A/B tests to find out which ones to keep. Label each ad slot as "Advertising". Aim for the long term.

# 8 Pages way too deep

For once, this is a problem that is (a priori) easy to solve: pages that are much too deep, meaning that reaching them from the homepage takes many clicks on links.

Yet it is well known: the deeper a page, the more its SEO performance crumbles.

Conclusion: even if this is not a quality issue, nor necessarily a big problem for the user experience, having so many pages buried so deep is detrimental to SEO performance. Such a site could increase its traffic very easily by working on this lever.

What to do? If the cause is purely technical, fix it (for example, pages that should never have existed, JavaScript links not seen by Google, or too few items per listing page). Otherwise, reduce the depth! A simple way to measure click depth is sketched below.
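
Here is a minimal sketch of how click depth can be measured with a breadth-first crawl starting from the homepage. The start URL and page limit are placeholders; it assumes requests and BeautifulSoup are installed, and a real audit would also respect robots.txt and canonical URLs.

```python
# Minimal sketch: measure click depth with a breadth-first crawl from
# the homepage. Stays on the same host and stops at MAX_PAGES.
from collections import deque
from urllib.parse import urljoin, urlparse
import requests
from bs4 import BeautifulSoup

START = "https://example.com/"
MAX_PAGES = 500

host = urlparse(START).netloc
depth = {START: 0}
queue = deque([START])

while queue and len(depth) < MAX_PAGES:
    url = queue.popleft()
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == host and link not in depth:
            depth[link] = depth[url] + 1
            queue.append(link)

# Show the 20 deepest pages found
for page, d in sorted(depth.items(), key=lambda kv: kv[1], reverse=True)[:20]:
    print(d, page)
```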

# 9 Black mass

What is the “black mass” in SEO?

This term refers to all the crawlable and sometimes indexable URLs that should not have been crawlable or indexable. They degrade the SEO performance of the site.

The expression “black mass” was coined in SEO training by Fabien Facériès, co-founder of Ranking Metrics 😉

These are URLs that Googlebot may encounter on your site that should not have existed, or should not have been crawlable, or should not have been indexable. In short, these URLs disrupt SEO, consume crawl budget, and in the worst case can change Google's perception of the site: if most of these pages are of low quality, they become part of the set Google analyzes and therefore lower the site's average rating.

Here are the most common causes that create black mass, found on the sites of my panel:

  • everything related to sorting functions in categories: they often add parameters such as order to the URL, which are better blocked from crawling (via the robots.txt file)
  • poor management of the trailing slash, i.e. the / at the end of the URL: either you use it or you don't, but not both
  • pages that are rarely worth crawling or indexing, such as author archives and, often, tags
  • duplicate URLs when the category is included in the item or product URL and the item appears in several categories, which generates duplicate content
  • URLs still in the old format, from before URL rewriting
  • URLs carrying Google Analytics tracking parameters (which you should never put in internal links)
  • internal search URLs (with endless variations), even though Google recommends not indexing them
  • the same URLs in both uppercase and lowercase variants
  • product variants with tons of possible values

I bet Google always responds "that's okay, we handle it". In theory, yes; in practice, it is another story. Why put a spoke in its wheels?

What to do? Compare the list of URLs in your comprehensive sitemap (I hope you have one) with the list of crawled and indexable URLs found by an SEO crawler (RM Tech, for example). A set comparison like the one sketched below is enough to get started.
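
Here is a minimal sketch of that comparison: it reads the URLs declared in your XML sitemap and compares them with a plain-text export of crawled, indexable URLs (the file name is hypothetical). URLs found by the crawler but absent from the sitemap are black-mass candidates.

```python
# Minimal sketch: compare sitemap URLs with a crawler's list of
# crawlable/indexable URLs (one URL per line in the text file).
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
sitemap_urls = {loc.text.strip() for loc in root.findall(".//sm:loc", NS)}

with open("crawled_indexable_urls.txt", encoding="utf-8") as f:
    crawled_urls = {line.strip() for line in f if line.strip()}

black_mass_candidates = crawled_urls - sitemap_urls
orphan_candidates = sitemap_urls - crawled_urls  # in sitemap but not found by the crawl

print(f"{len(black_mass_candidates)} crawled URLs missing from the sitemap")
print(f"{len(orphan_candidates)} sitemap URLs not found by the crawl")
```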

# 10 Huge download and load times

I distinguish two types of measurement here:

  • download time is the time taken by a crawler (Googlebot, for example) to retrieve a resource. In general, we focus on HTML, but this concerns everything that is allowed for crawling: images, CSS, JavaScript, etc.
  • load time is the time it takes for a full HTML page to be displayed.

In both cases, your site must be very fast. This is important for SEO, but even more so for the effectiveness of your site, that is, its ability to meet its objectives (online sales, contacts, advertising, etc.).

  • if pages take too long to download, Google reduces the number of URLs crawled per day or month (this is the crawl budget). If your pages are no longer crawled, or are crawled infrequently, problems will pile up and you will have a hard time getting out of them.
  • if pages take too long to load, visitors may simply leave your site. That is already bad, but if they also go back to Google, it is far worse.

Google recommends that server processing time to produce a page be under 200 ms (source).

  • What to do? First diagnose, then apply the recommendations the tools give you. For download time, start with the curve in Google Search Console: it gives a general trend. Complete it with a crawl of your site using any crawler that measures download time; a basic check is sketched after this list. Pay special attention to your images (to find files that are too heavy or dimensions that are too large, for example). If you have access to your server logs and they are properly configured, study them as well.
  • For load time, run tests on different pages of your site with tools like Dareboost or GTmetrix. These two tools build on others such as Google PageSpeed Insights and YSlow, which is convenient.
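
As announced above, here is a minimal sketch for the download-time side only: it measures how long the raw HTML of a few URLs takes to retrieve, as a rough stand-in for what a crawler sees. It is not a full load-time test (which also involves images, CSS, JavaScript, and rendering); the URLs are placeholders and requests must be installed.

```python
# Minimal sketch: measure raw HTML download time for a few URLs.
import requests

urls = [
    "https://example.com/",
    "https://example.com/category/page/",
    "https://example.com/some-article/",
]

for url in urls:
    response = requests.get(url, timeout=30)
    # response.elapsed covers the time until the response headers arrived;
    # the body size is printed so very large pages stand out too.
    print(f"{response.elapsed.total_seconds() * 1000:7.0f} ms  "
          f"{len(response.content) / 1024:7.1f} KB  {url}")
```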

# 11 Too many non-indexable but crawlable pages

This is in a way a variant of the black mass, which could be described as a "controlled black mass": the site generates many URLs that are not meant to be indexed, and Google gets tangled up in them. You offer it lots of internal links to pages, and every time it crawls them, the noindex instruction tells it: "sorry, you came for nothing, you must not index this".

No, it is not going to get upset; it is a bot!

But if this takes on huge proportions, Google's crawl becomes inefficient on your site. Little by little, Google may reduce how often it crawls these pages.

In the end, what is the point of making it crawl all this?

Sometimes these are pages made up of lists of links, for example filtered or sorted categories, tags, or pagination. Create sub-categories instead, or find other ways to complete your taxonomy; it will be much more effective.

How to correct it? First, you have to spot it: calculate the proportion of indexable pages among your crawlable pages, as in the sketch below. If it is below 95%, I advise you to review how you manage your site and its crawl.
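
Here is a minimal sketch of that calculation, based on a crawler export in CSV form. The file name and the column names (url, indexable) are hypothetical; adapt them to whatever your crawler produces.

```python
# Minimal sketch: share of indexable pages among crawlable pages,
# from a crawler export. Column names are hypothetical.
import csv

with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

crawlable = len(rows)
indexable = sum(1 for row in rows if row["indexable"].strip().lower() in ("true", "yes", "1"))

rate = indexable / crawlable if crawlable else 0.0
print(f"{indexable} indexable pages out of {crawlable} crawlable ({rate:.1%})")
if rate < 0.95:
    print("Below 95%: review which URLs you expose to crawling.")
```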

I encountered other errors in the sites studied, but of lesser magnitude, or on too few sites for them to be mentioned here.
