The x Words Minimum Rule is an SEO myth
The idea that pages must reach some magic minimum word count before they will rank is mostly nonsense, and here is why!
By: Jacob
Edited: 2020-02-18 17:53
There is a persistent myth in the SEO world that reaching some magical word count, usually 300 or 500 words, will give you better rankings. Naturally, everyone wants better search engine rankings, and so the claim spreads that if your articles do not reach x words, they won't rank well, or might not get indexed at all. Content length does sometimes matter, but this is far from always the case.
As of this writing, Beamtic has a number of articles that are just below 100 words in length, and nearly all of them are indexed. I think the smallest articles we have had were around 50 words!
But my point is not to get people to write shorter posts. My point with this article is that short pages are useful in some circumstances and less useful in others.
On Beamtic, I have created a tool to help me find potentially low-quality content that might need to be improved. I am not afraid to delete poor-quality content, even when it has incoming links, and sometimes I completely rewrite older articles to improve them; in that case I will often delete the old article rather than redirect it.
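The tool itself is specific to Beamtic, but the basic idea is easy to sketch. Below is a minimal Python example that flags pages under a raw word-count threshold as candidates for review; the URL list, the threshold, and the HTML stripping are illustrative assumptions, not the actual implementation.

```python
import re
import urllib.request
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def word_count(url):
    """Fetch a page and count the words in its visible text."""
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="replace")
    extractor = TextExtractor()
    extractor.feed(html)
    return len(re.findall(r"\w+", " ".join(extractor.parts)))

# Hypothetical URL list and threshold; tune both to your own site.
URLS = ["https://example.com/some-article"]
THRESHOLD = 100

for url in URLS:
    count = word_count(url)
    if count < THRESHOLD:
        print(f"Review candidate ({count} words): {url}")
```

Word count alone is a crude signal, of course, so a list like this is only a starting point for manual review, not a deletion queue.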
Redirecting (301) deleted content is technically and semantically wrong, and it gives little or no SEO benefit. Web shops might do this in an attempt to keep the value of incoming links to dropped or expired products, but it is an ineffective long-term strategy of doubtful value. I would simply avoid it, because a 301 means the content has moved permanently, and deleted content has not moved anywhere; according to the HTTP protocol, the honest response for removed content is 410 Gone (or 404 Not Found).
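To make the distinction concrete, here is a minimal Python sketch, using only the standard library, of a server that answers requests for deleted pages with 410 Gone instead of redirecting. The paths are made up for illustration; on a real site you would more likely configure this in your web server than in application code.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical set of paths whose content has been deleted for good.
REMOVED_PATHS = {"/old-article", "/discontinued-product"}

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REMOVED_PATHS:
            # 410 tells clients and crawlers the resource is permanently gone.
            self.send_response(410)
            self.end_headers()
            self.wfile.write(b"410 Gone: this page has been removed.")
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"Hello from the live site.")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()
```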
How much or how little is needed?
As you will likely find discussed in other articles on Beamtic, writing longer is usually better than writing short, but what actually matters has little or nothing to do with the length of the posts themselves. Rather, it has to do with the originality and uniqueness of your content.
If your content is unique, you will nearly always rank regardless of how long your posts are; but if a million other bloggers are writing about the same topics, ranking is going to be harder, and you may need to put more effort into your writing.
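Uniqueness is hard to quantify exactly, but you can get a rough signal by measuring how much your text overlaps with competing pages. The following sketch uses word shingles and Jaccard similarity, which is just one simple way to estimate overlap; the sample texts are invented.

```python
def shingles(text, n=3):
    """Break a text into overlapping n-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Jaccard similarity: shared shingles over all shingles."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Invented sample texts; in practice, compare your draft against
# the top-ranking pages for the query you are targeting.
mine = "word count is not a ranking factor on its own"
theirs = "word count alone is not a ranking factor in google"
print(f"overlap: {jaccard(mine, theirs):.2f}")
```

A high overlap score does not mean a page will not rank, but it suggests the content adds little that the competition does not already offer.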
The ideal content length has often come up as a topic of debate in SEO communities, and while there might be some truth to the "longer is better" idea, as I also discussed above, it only seems to apply in some cases.
Competitive niches might be a determining factor
If there is a lot of existing content in a certain field, long posts should in theory perform better in the search engines than short ones. But even then, there is no guarantee that you will outperform other websites on the subject.
Much of this can be worked out almost intuitively by performing simple thought experiments. For instance, you could ask questions like:
- What are the main ranking factors that Google uses, besides links?
- What is page X doing differently to rank higher than page Y?
But to answer questions like these, you would also need to know how Google ranks search results. Answering them precisely is hard, but you can get close enough, and that might enable you to make the changes needed for your pages to start ranking better in the search results.
Thought experiments like this are useful, as they allow you to spend less time searching for information or running tests. In this case, a simple thought experiment suggests that there really is no minimum word count: Google wants to index pages as long as they are relevant to searchers.
Where does the 300-word myth come from?
Presumably it originates from blackhat SEOs who got hit by the Panda and Penguin updates, which were algorithm changes aimed at improving the quality of search.
Panda was a change to Google's search algorithm that first appeared on February 23, 2011, so a long time ago now. The purpose of the update was to boost the ranking of high-quality pages and to demote pages of lower quality. Generally, the purpose of every update to the algorithm is to improve search, so we will not focus too much on the individual updates. However, in a blog post, Amit Singhal from Google mentions something very interesting:
One other specific piece of guidance we've offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content. – More guidance on building high-quality sites
Which brings me back to one of my earlier points in this article: the importance of updating old content, and even deleting low-quality content. By getting rid of outright garbage and improving old content with potential, you can actually end up improving the ranking of your website as a whole, and not just of the individual pages.
Penguin focused mostly on unnatural links, that is, websites with a portfolio of fake links pointing to them. It has been typical of blackhatters to try to manipulate their rankings by buying links or by spamming links themselves.
As these blackhat SEOs got hit, they likely invented the 300-word minimum idea in an attempt to avoid being punished for thin content.