SEO Tactics To Retire in 2024 | GrowTraffic

It makes sense that some SEO tactics need to retire. Google changes a lot. More than many people realise. In 2022, it changed 4,725 times. That’s 13 times a day. That doesn’t include the tens of thousands of experiments it does.

Most are minor and we don’t really notice.

Some we know about because rankings change suddenly.

Some we know about because Barry Schwartz tells us (may the universe guard and protect him).

And some we are forewarned about as being industry shattering, SEO-slaying shifts that make us all fear for our lives and wake up in a cold sweat.

If we changed tactic every time there was a ranking fluctuation or Google changed something, we would never know what needed to change or be improved. And, crucially, what now (or still) works.

Keeping abreast of these changes is daunting, especially as the industry tries to keep up with the development of AI. It feels like the changes have come thick and fast in the last few years, particularly after a period of relative stability pre-2019. (I say relative because there was still a shit tonne of updates, just not as many).

One small caveat here: A lot of the changes Google makes take time to be impactful. The efficacy of some of these tactics has been waning for a while, but in some circumstances will still work a bit.

This isn’t a lecture on what to stop doing IMMEDIATELY and undo on your websites. It is a gentle phasing out in favour of something much more effective.

The core message of Google has been the same for a long time:  

Produce correct, quality answers and content for people that they can trust.

Don’t try to game it.

Don’t try to hack the system.

Don’t spam people.

Don’t be sloppy.

Be awesome (that’s the actual word they use.)

They are a business. And so, they want to provide the best search results for their customers. (I strongly suspect the Helpful Content Update in 2023 was about reducing server space, but maybe that’s just me with my tinfoil hat on).

They want to stop bad practices that try to game their lovingly crafted algorithm.

And so, they keep refining it, which means we need to evolve.  

So let me take you on a journey.

To a place where food is pureed, nappies are optional (I hope), and the drugs are plentiful (I hope).

A place we have taken our once loved and treasured SEO tactics to, to let them lay down their weary heads and thank them for their service. (It’s a utopia, just go with it.)

The Nursing Home Of SEO. Let’s meet the residents and learn which SEO tactics to retire in 2024.

Resident Number 1: AI Generated SEO Content

I don’t want to retire this one, so much as put it in a bin, pour petrol on it and set it on fire (anthropomorphising SEO tactics has taken an unexpectedly dark turn. Sorry about that).

Technically it has not really been a ‘tactic’, nor has it been around a long time.

But I will take ANY opportunity to beg people to not use AI generated content on their website for SEO purposes.

I’m sure none of you do. But just in case.

It is garbage.

Please stop.

I’m being hyperbolic. I should say, AI generated content is not a reliable way of improving your user signals, ranking capabilities or authority.

Let’s look at why.

Why Retire It?

Firstly, AI generated content doesn’t rank well. According to Neil Patel (lovingly referred to as Uncle Neil at GT HQ), 94% of top-ranking web content is human generated. In the video linked there, he also highlights that it doesn’t drive traffic. People can tell it is AI, and so steer clear. The top SERP results are human generated content. The cycle continues and AI generated content continues to slip down the SERPs.

Not to mention, Google is actively cracking down on websites that rely heavily on AI generated content.

Look at articles written before March 2024, and they will mostly say AI content is OK for SEO as long as it is reviewed by a human.

But then, the March update happened.

The March 2024 broad core update was targeting spam, but many reported websites/pages that were using AI were penalised, dropped rankings or, in many cases, deindexed (including John Mueller’s! Coincidence, or has John been a lazy boy?)

Now, there is an old saying in SEO: correlation is not causation.

So, it could be possible that the sites that were using AI to generate content were spammy in other ways. If you are using AI to generate SEO content, you are likely trying to game the system in some other way too, right?

For example, some of the sites that were deindexed were created solely to make one specific keyword rank, to create authoritative backlinks or to act as personal content repositories.

What Should We Do Instead?

Replace it with well written content using your own expertise.

Now I’m not saying never use AI to help your marketing efforts. It can be useful, time saving, inspirational and insightful. We use it for keyword research and to help structure blog content. It is great for refining a strategy, or pulling some research together.

Where it falls down is if you ask it to write you a blog, web page, product description etc, and just proofread it and whack it up.

So why should we retire AI generated SEO content?

Well, consider what Google is looking for.

  • Accuracy
  • Authority
  • Content created for humans
  • Originality

How confident are you that the information it is using is accurate? ChatGPT in particular is well known for using out-of-date information and SERP results.

You could potentially be linking to spammy websites with low DA or low relevancy. You are not utilising any of your expertise to create that content. You aren’t adding anything new or generating authority.

It is CLEARLY written by a bot. For example, the overuse of commas is a dead giveaway. It doesn’t sound human. It doesn’t feel human. It doesn’t engender trust.

And it runs the risk of not being original content. AI by its very nature uses others’ content to inform and teach it. As we all know, original content is essential for SEO.

Up until the end of February, we were all saying that, whilst AI content won’t harm your SEO efforts, it probably won’t help either. Now we aren’t so sure.

My prediction is that Google will continue to favour high-quality, human-written content created for humans, and so will eventually come down hard on sites that use AI too much.

Resident Number 2: One Page Per Location For Local SEO

A couple of years ago, if you wanted to rank a carpet cleaning business that serviced South Yorkshire, you might structure an area of the site like this:  

  • Locations
    • Carpet Cleaning South Yorkshire
      • Carpet Cleaning Sheffield
      • Carpet Cleaning Rotherham
      • Carpet Cleaning Doncaster
      • Carpet Cleaning Barnsley

Your URLs would be very similar for all these:

www.businessname.co.uk/locations/carpet-cleaning-south-yorkshire/carpet-cleaning-sheffield

www.businessname.co.uk/locations/carpet-cleaning-south-yorkshire/carpet-cleaning-rotherham

www.businessname.co.uk/locations/carpet-cleaning-south-yorkshire/carpet-cleaning-doncaster

Then you would perhaps write some blogs that tackled long tail questions, like ‘How To Find A Carpet Cleaner Near You’, ‘Best Carpet Cleaners In South Yorkshire’ etc.

Why Retire It?

The Helpful Content Update would consider these pages too similar and lacking value. They have clearly been created just for SEO purposes.

It might go so far as to only index one of the pages. (But which one? More below)

UNLESS the content on them was vastly different and the URLs were more unique.

So, you could have a ‘carpet cleaning in South Yorkshire’ page, and a blog about quality carpet cleaning in South Yorkshire, because the URLs would be different, the intent is different, and the messaging will be different (although it could be argued that, too, would be considered a bit thin).

What To Do Instead

If you genuinely need this sort of content on your site (for a location-specific booking engine, for example), you would add a canonical tag to the near-duplicate pages pointing at the primary page, the one you want treated as the ‘canon’. It tells Google that you know the pages are linked and similar, and that the page the canonical points to is the one with value that should be indexed, not the others.
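
To make that concrete, here’s a minimal sketch of the tag itself. It uses the made-up carpet cleaning URLs from earlier, so treat it as an illustration rather than something to copy wholesale:

```html
<!-- Hypothetical sketch: this sits in the <head> of the near-duplicate
     location pages (e.g. the Rotherham and Doncaster pages) and points
     Google at the South Yorkshire page as the version to index. -->
<link rel="canonical" href="https://www.businessname.co.uk/locations/carpet-cleaning-south-yorkshire/" />
```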

Otherwise, to rank locally, you stick to the good old basics of local SEO:

  • Schema markup (see the sketch after this list)
  • Local Directories
  • Google My Business
  • Locally relevant content
  • Local link building
  • Local reviews
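
On the schema markup point, here’s a rough sketch of what LocalBusiness structured data might look like for our imaginary carpet cleaner. Every detail in it is invented:

```html
<!-- Hypothetical sketch: LocalBusiness structured data for the example
     business. Name, URL, phone number and address are all made up. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Carpet Cleaning",
  "url": "https://www.businessname.co.uk/",
  "telephone": "+44 114 000 0000",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Sheffield",
    "addressRegion": "South Yorkshire",
    "addressCountry": "GB"
  },
  "areaServed": ["Sheffield", "Rotherham", "Doncaster", "Barnsley"]
}
</script>
```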

Some people do still report this tactic working but as the Helpful Content Update continues to be refined and becomes more impactful, this is definitely a tactic that needs turning out to pasture.

No need to start redirecting and removing pages, however.

Resident Number 3: One Keyword Pages

Pre-2020, a solid strategy to get a keyword to rank was to churn out LOADS of content on a particular keyword, its semantics, secondary and long tail keywords, to get it to rank. The more the better.

So, if you wanted to rank for the key phrase ‘carpet cleaning’, you would have a page (or post) for each of the following:

Primary keyword:

  • Carpet Cleaning

Secondary and semantic keywords:

  • Rug Cleaning
  • Carpet Shampooing
  • Deep Carpet Cleaning
  • Professional Carpet Cleaning Service
  • Eco-Friendly Carpet Cleaning
  • Steam Carpet Cleaning
  • Carpet Cleaning Prices
  • Carpet Stain Removal
  • Upholstery Cleaning
  • Pet Carpet Cleaning
  • Carpet Odor Removal
  • Carpet Mold Removal

Longtail keywords:

  • Best carpet cleaning service near me + [location]
  • Professional carpet cleaning for pet hair removal
  • Same-day carpet cleaning for emergencies
  • Carpet cleaning for oriental rugs
  • Eco-friendly carpet cleaning with non-toxic solutions
  • Carpet cleaning for high traffic areas
  • Professional carpet cleaning for a move-in/move-out
  • Residential carpet cleaning service for same-day booking
  • Carpet cleaning discounts for first-time customers
  • Carpet cleaning for allergies and asthma sufferers
  • Deep carpet cleaning to remove dust mites and bed bugs
  • Professional carpet cleaning for water damage restoration
  • Eco-friendly carpet cleaning with plant-based solutions
  • Carpet cleaning for upholstery and furniture
  • Pre-holiday carpet cleaning service

As a result, the internet was chock full of superfluous content that was adding very little value.

Why Retire It?

Google decided to clamp down on this practice and instead reward rich, informed, well-researched articles and pages (remember E-E-A-T).

Since the Helpful Content update started rolling out, it is far better to have longer pieces of content that capture several topics and make better use of H-tags and a Q&A style, aiming for a snippet.

What To Do Instead?

Now we have started grouping those topics into primary and secondary keywords and creating longer form content that focuses on answering all the follow-up questions, amalgamating topics so it follows the natural flow of a buyer journey.

So, for example, one blog might be more like this now:

Title: Our Specialist Carpet Cleaning Services

Contains:

  • Allergies And Asthma Sufferers
  • Remove Dust Mites And Bed Bugs
  • Water Damage Restoration
  • High-Traffic Areas
  • Pet Hair Removal
  • Oriental / Delicate Rugs
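
To show what ‘making better use of H-tags’ can mean in practice, the outline of that amalgamated post might look something like this. The headings are illustrative only, with a question-style H2 where a snippet is the target:

```html
<!-- Hypothetical heading outline: one H1 for the main topic and one H2
     per sub-topic, rather than a separate thin page for each keyword. -->
<h1>Our Specialist Carpet Cleaning Services</h1>
<h2>Carpet Cleaning For Allergy And Asthma Sufferers</h2>
<h2>How Do You Remove Dust Mites And Bed Bugs From A Carpet?</h2>
<h2>Carpet Cleaning For Water Damage Restoration</h2>
<h2>Cleaning High-Traffic Areas</h2>
<h2>Pet Hair Removal</h2>
<h2>Cleaning Oriental And Delicate Rugs</h2>
```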

As usual, the focus is on ensuring the content is well researched, has high-quality outbound links and an internal link structure that makes sense, and is genuinely useful to a human.

Another great way to replace this strategy is through FAQ style pages. This helps prepare your content for AE (answer engines).

Your SERP might be looking a little different these days.

This is the future of Search Engines: Answer engines. They rely on Q&A style content to give quick answers.
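
If you want to lean into that Q&A format, FAQPage structured data is one way to mark it up. A minimal sketch, with an invented question and answer:

```html
<!-- Hypothetical sketch: FAQPage structured data for a Q&A-style section.
     The question and answer text are made up for illustration. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "How often should carpets be professionally cleaned?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Most manufacturers recommend a professional deep clean every 12 to 18 months, or more often in high-traffic areas and homes with pets."
      }
    }
  ]
}
</script>
```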

It’s experimental and we don’t yet fully understand how Google wants us to prepare or optimise, but the most recent updates have all pushed towards lean, quality websites, most likely in preparation for an AI model.

So, it is more important than ever to write content that answers the questions people are asking. Again, we are back to E-E-A-T.

Resident Number 4: Exact Match Keywords

When I first started doing SEO we had a jeweller as a client. We could not get traffic to that website for any key phrase that contained the word ‘jewellery’.

Search volumes were tiny.

So, we wrote a few blogs on ‘jewelry’, and one on ‘jewellery vs jewelry’. We added a solid internal link structure. And it flew. Traffic exploded.

The reason was that Google hadn’t yet made the connection between ‘jewellery’ and ‘jewelry’. People were googling two different spellings, and we weren’t getting the rankings or traffic for the more popular one.

Then we said ‘Hey BERT’ in 2019 and it started to change. (Sesame Street reference there for the other millennials).

Google made huge leaps in understanding how humans search and could predict the next three or four questions you might ask. It began to make links between content, colloquialisms, semantics and so on.

We no longer need to have Exact Match keywords in our content.

In fact, it is better not to.

What To Do Instead?

Now, rather than writing a page for each keyword (similar to above), it is best to include the semantics in your copy.

It is more natural that way.

So, for example, if you are writing a blog about tips for carpet cleaning, you might use ‘ways to clean a carpet’, ‘rug cleaning’, ‘sprucing up your carpet’ and so on, and Google will know they mean the same thing.

I’ll show you.

So, let’s use the search query ‘tips for cleaning a carpet’.

The top pages are optimised for different things:

[Image: SERP results for ‘tips for cleaning a carpet’, showing pages optimised for different keyword variations all ranking for the same query]

The key phrases in these articles vary: carpet cleaning, deep clean carpets, spring clean carpets.

They are all semantics of the same concept but they are all ranking for the same query.

When you combine this outdated strategy with the one above, you build up a picture of Google wanting long form content that answers multiple queries within one article. Less content, working harder.

Resident Number 5: Link Spam

The efficacy of building large quantities of backlinks is questionable and we often see very relevant, well written websites with very low domain authorities consistently ranking in the top 20 for a variety of queries.

For the broad keyword ‘Royce Gracie Jui Jitsu’, you can see a variety of Domain Trust scores in the top 20 results:

[Image: SE Ranking screen grab showing the range of Domain Trust scores in the top 20 results]

However, backlinks demonstrate E-E-A-T, particularly the Expertise and Authority elements, and can generate valuable referral traffic, so they aren’t ready to retire yet. They still have plenty to give.

But their lazy colleague, ‘buying backlinks’ can retire.

There are some websites that sell cheap space and a backlink. You can put a blog up about anything and there are some loose categories. Many people don’t really see anything wrong with this and still include it in their marketing mix.

But since the spam updates that have periodically rolled out over the last few years, it is a bad idea.

It will probably work in the short term still, but long term your website’s spam profile will increase, and you might even get hit with a penalty.

Why Retire It?

It is against Google’s Webmaster Guidelines.

Link spam in general will get you hit with a penalty. Link spam is:

  • Buying links, exchanging services for a link, or providing a gift for review and a link
  • Partner pages purely for the purpose of linking.
  • Lots of cross or reciprocal linking
  • Automated link building
  • Requiring a link as part of a terms of service (we see this a lot in directories)
  • Text advertisements or text links that aren’t marked rel="nofollow" or rel="sponsored" (see the markup sketch after this list)
  • Low-quality directory or bookmark sites
  • Widgets embedded across various sites, particularly if they are hidden, low quality, or keyword rich.
  • Lots of links in the footers or templates of other sites
  • Forum comments with optimised links
  • Creating thin content just to manipulate links or rankings

 What do all these things have in common? They aren’t genuine recommendations or endorsements.
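
For the paid or ad links you do legitimately carry, the fix is in the markup, as flagged in the list above. A quick sketch, with invented URLs and anchor text:

```html
<!-- Hypothetical sketch: marking paid and untrusted links so they aren't
     treated as an attempt to pass link equity. -->
<a href="https://example.com/partner-offer" rel="sponsored">Partner offer</a>
<a href="https://example.com/user-submitted-resource" rel="nofollow">User-submitted resource</a>
```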

I’m not saying it is time to stop backlink building altogether; it’s the spammy stuff above that we’re phasing out.

What To Do Instead?

Instead, create content that is well researched, original, useful, informative, contributes to the industry, or genuinely helps people.

Share it far and wide. Offer it freely. The backlinks will come. There may be fewer, but they will be genuine and so much more valuable.

Resident Number 6: Over-optimising

Over-optimising hasn’t been working for a while, but at GT, we still work on websites that have been over-optimised, either because a well-meaning but uninformed person has worked on it before it came to us, or because it hasn’t been re-optimised in a few years and the practices have become outdated.  

A lot of the Google updates have been targeting elements of over-optimisation, either by phasing out the practice and replacing it with something better (BERT, for example) or by actively removing certain techniques (for example, the Exact Match Domain update of 2012).

As I mentioned at the beginning, a lot of these updates happen under the radar, and we never know. They are tweaks. Sometimes, we get told to change our practice. Sometimes we get forewarned so we can start bringing our websites into line, like with the Core Web Vitals update in 2021.

Some of the elements of over-optimising include:

  • Keyword stuffing
  • Exact match domains
  • Keyword rich anchor links
  • Meta description on every single page
  • Fixing every crawl error
  • Multiple H1 tags on a page
  • Too many internal and external links on each page
  • Emojis ☹
  • Writing lots of thin content

Why Retire It?

Some of these optimisation tasks take a long time and don’t give you much benefit. This article on Ahrefs has a good insight into the rule of diminishing returns of over-optimisation.

And some of it is considered spammy.

Keyword stuffing, for example, has long been known to be spammy. Thin content has been specifically targeted in recent years.

What To Replace It With?

If you are still on this field trip with me and not sat in the minibus eating your packed lunch by now, you will hopefully have begun to build up a picture of what Google wants:

Genuinely helpful content that is written for humans, not bots.

Over-optimising is trying to manipulate their bots.

Therefore, Google don’t like it. And as we are all at their party, we do what they tell us.

Instead of trying to hack the system, just create a site that:

  • has considered the user journey.
  • is well written and researched.
  • can be crawled.
  • is secure.
  • is not trying to manipulate anyone into doing anything.
  • provides a good experience to the user (core web vitals)

OK, back on the bus. Time to leave the retirement home of SEO tactics.

These tactics need gently phasing out rather than burying overnight, so don’t feel the need to radically re-optimise your site straight away.

If you have made it this far and want to ask me a question or get some help on your own SEO, you can email me on [email protected].
