On August 18th, Google announced a BIG algorithm change to address poor-quality content.
If you’re not familiar with the announced change, I’ll quote the crux of it straight from Google, along with a summary from Search Engine Roundtable.
In this post I’ll walk through WHY Google is making this update and HOW to optimize for it – if you’d just like to know the facts, I’ll link to some technical articles at the bottom of this post.
This ranking update will help make sure that unoriginal, low-quality content doesn’t rank highly in Search, and our testing has found it will especially improve results related to online education, as well as arts and entertainment, shopping and tech-related content.
And they provide an example of what is considered “unoriginal, low-quality content”:
For example, if you search for information about a new movie, you might have previously seen articles that aggregated reviews from other sites without adding perspectives beyond what’s available elsewhere. This isn’t very helpful if you’re expecting to read something new. With this update, you’ll see more results with unique, authentic information, so you’re more likely to read something you haven’t seen before.
So what does this mean for SEOs and webmasters? Particularly those in online education, arts & entertainment, shopping, and tech – and the update also impacts product reviews.
Well, in this humble SEO’s opinion, I’m quite pleased to see this change and hope it delivers on the promised intent. Black-hat tactics have been creeping back into SEO effectiveness in recent years, driven by three factors: AI content tools, SEO content tools, and the sheer volume of content being created.
Combatting AI-Written Content
To name names, Jasper.ai (previously Jarvis.ai), Copy.ai, and other AI-driven content writing tools make it maybe a little too easy to lift content from existing pages. If you haven’t tried these tools, they let a writer enter keyword prompts, then generate a paragraph or two by scraping and typically rephrasing content from pages already ranking on the topic. It makes writing a lot faster, but if the author isn’t refining the new article to be more original or have a point of view, then the content is just a garbled version of the originals.
So should SEOs and content writers delete all their AI content and abandon these tools? Well, if you use them simply as a research tool and not a complete source of content, then you should be fine. If you are letting the AI write full pages and articles for you, then watch out.
Over-Reliance on SEO Content Tools
SEO content tools like Surfer and ClearScope are awesome – we use Surfer for every piece of content we write.
If you’re not familiar with them, they scrape the top-ranking pages, find the most frequently used keywords, and then recommend those terms for your new page. For example, if you were writing a page intended to rank for “basketball”, they would suggest including terms like “slam dunk, free throw, LA Lakers, NBA” and how frequently to use them.
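To make that concrete, here’s a minimal sketch of what such a tool does under the hood: count term frequencies across competing pages and surface the most common ones. The sample snippets, stopword list, and `recommend_terms` function are all illustrative – this is not any vendor’s actual implementation.

```python
from collections import Counter
import re

# Hypothetical snippets standing in for scraped top-ranking pages on "basketball".
top_pages = [
    "The NBA free throw is a basic basketball skill every player drills.",
    "A slam dunk by the LA Lakers brought the NBA crowd to its feet.",
    "Basketball practice covers the free throw, the slam dunk, and defense.",
]

# A tiny illustrative stopword list; real tools use much larger ones.
STOPWORDS = {"the", "a", "is", "by", "to", "its", "and", "every", "on"}

def recommend_terms(pages, top_n=5):
    """Count how often each term appears across competing pages and
    return the most common ones - roughly what Surfer or ClearScope surface."""
    counts = Counter()
    for page in pages:
        words = re.findall(r"[a-z]+", page.lower())
        counts.update(w for w in words if w not in STOPWORDS)
    return counts.most_common(top_n)

print(recommend_terms(top_pages))
```

Real tools layer on weighting, competitor word counts, and NLP entity extraction, but the core signal is this kind of frequency analysis.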
But they are only tools for research and guidance, not meant to write the content end to end.
I’ve seen first-hand how our freelance writers would take the frequency guides too literally and stuff the same keyword 27 times into a page until it looked ridiculous – even though the tool graded it 97 out of 100 for optimization.
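A quick density check is enough to catch that kind of stuffing before it ships. The `keyword_density` helper below is a hypothetical sketch, not part of any tool:

```python
import re

def keyword_density(text, keyword):
    """Return how many times a keyword appears and its share of total words."""
    words = re.findall(r"[a-z']+", text.lower())
    count = sum(1 for w in words if w == keyword.lower())
    return count, (count / len(words) if words else 0.0)

# An obviously over-optimized sample page.
page = ("Basketball tips for basketball players: improve your basketball "
        "game with these basketball drills.")

count, density = keyword_density(page, "basketball")
print(f"'basketball' used {count} times ({density:.0%} of all words)")
# prints: 'basketball' used 4 times (31% of all words)
```

There is no magic “safe” density number, but when a single keyword makes up a third of a page’s words, no tool score will save it from reading as spam.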
The Volume of Content Being Created
Every day, millions of new pages are created and submitted to be indexed by Google. The volume increases every year, yet there are only about 10 spots on page 1 of Google (yes, I know this is changing with personalization, etc., but let’s not go there now). And most of those pages are frankly not that good – lacking original content or a unique perspective, poorly designed, unauthoritative, or otherwise just not as good as what already existed.
Web marketing teams rely on SEOs, who rely on content writers, to bang out pages daily – with more focus on keywords and rankings than on whether the page is actually valuable to the target audience.
So what should SEOs do about the Helpful Content Update?
If you got lazy with your SEO content writing, then prepare for a savage spanking from Google over the next few weeks.
How to fix it:
- Crawl your website with Screaming Frog and connect data from Google Analytics and Google Search Console
- Review the pages with the most organic traffic (and possibly conversions)
- Rewrite those pages with more original content, a unique perspective, and a dialed-back keyword density (if needed) – in short, make each page something your target visitor would find valuable when searching the intended keyword topic.
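The triage step above can be sketched in a few lines. The hard-coded rows stand in for a Screaming Frog crawl and a Search Console export, and the thresholds for “thin” and “meaningful traffic” are illustrative assumptions, not recommendations:

```python
# Hypothetical data joined from a Screaming Frog crawl and a Search Console
# export. Field names and values are illustrative; real exports will differ.
crawl = {"/guide": 1800, "/review": 400, "/news": 250}   # url -> word count
gsc = {"/guide": 1200, "/review": 900, "/news": 15}      # url -> organic clicks

def rewrite_candidates(word_counts, clicks, min_clicks=100, max_words=500):
    """Flag thin pages that still earn meaningful traffic - the best
    candidates to rewrite with more original, in-depth content."""
    return [url for url in word_counts
            if word_counts[url] < max_words and clicks.get(url, 0) > min_clicks]

print(rewrite_candidates(crawl, gsc))  # prints: ['/review']
```

The point isn’t the thresholds – it’s the ordering: fix the pages where weak content meets real traffic first, because those are the ones a quality-focused update puts at risk.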
What to do going forward:
If you’ve always written with reader-first, SEO-second priorities, then you have nothing to worry about – and can only benefit as Google demotes weaker-ranking content (essentially rewarding you).
Writing with a reader-first priority means:
- Learning what your visitors and customers want to know.
- Using SEO research to identify the pockets of keyword volume around those topics of interest.
- Analyzing the SERP to see how others are answering those questions – be it numbered lists, blogs, product pages, videos, or other formats. This is the blueprint to follow.
- Creating your content with a unique perspective, directly answering the user’s query in a format they can understand and at an appropriate depth.
- Then implementing SEO elements based on your keyword research (e.g., meta tags, H1, slug, internal links, image tagging, schema).
- Distributing the new content (if appropriate) via social channels and link-building outreach.
- Monitoring and revising as needed.