Will fully automated AI blogs destroy the "Content is King" era?
A fully automated AI blog is no longer a fantastical idea: AI writing tools such as Jasper, powered by the GPT-3 model, are already coming to the fore.
In addition, when Jasper is combined with Grammarly, it can produce quite readable and "original" content.
It is only a matter of time, perhaps just a few years, before these algorithms reach a level at which they can generate articles on their own and manage the operation of an entire blog.
Can you imagine having a fully automated blog that can generate hundreds of thousands or even millions or billions of posts every day?
You could simply enter parameters such as the number of articles, their topics, length, tone or structure, or you could even leave the choice entirely to artificial intelligence.
Benefits of AI Blog
Please close your eyes and imagine how much time, effort and money this would save you.
In addition to these benefits, such a fully automated AI blog would also bring very high traffic to your blog, or to the webpage of which the blog is a part.
Why? The answer is simple: articles created by artificial intelligence would be optimised for on-page SEO, and on-page SEO is an indispensable part of the overall SEO of your blog.
If you have good SEO, you have a good position in the SERP; if you have a good position in the SERP, you have high traffic; where there is high traffic there are ads, services and products; and where there are ads, services and products, there is money.
I will briefly mention what makes a quality article and, therefore, what positively contributes to your SEO.
- Keyword density
- Reasonable article length
- Non-Text Elements
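The first of these factors is easy to illustrate. As a hypothetical sketch (the formula is a common rule of thumb among SEO practitioners, not an official search engine specification), keyword density is simply the share of keyword occurrences in the total word count:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of the total word count, in percent."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase = keyword.lower().split()
    n = len(phrase)
    if not words or n == 0:
        return 0.0
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == phrase)
    # A multi-word keyword consumes n words per occurrence.
    return 100.0 * hits * n / len(words)

article = "AI blogs are coming. A fully automated AI blog writes itself."
print(round(keyword_density(article, "AI blog"), 1))  # → 18.2
```

In practice, densities in the low single digits are usually considered natural; an AI generator could trivially tune its output to hit whatever range it targets.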
Awesome; we are all looking forward to this, believing that, in a couple of years, the content of webpages and blogs, and on-page SEO along with it, will be taken care of for us.
However, the opposite is true.
Software vs Brain
The problem is that every algorithm behind such a fully automated AI blog is made by humans.
Since no human can create anything more intelligent than themselves (at least not yet :), such an article will not be a qualitative contribution to the Internet and its users.
The text or article produced by such an algorithm is merely a rewriting and paraphrasing of original ideas from several different sources!
It will not be plagiarism, but from a qualitative point of view it adds no value, neither for humans nor for machines such as search engines.
Google's search engine (Google LLC, or more precisely Alphabet Inc.) prefers original content. Every quality blog should therefore consist of articles that have been well researched and, especially, developed on the basis of brainstorming the topic, the author's knowledge and experience, the results of various tests and analyses, and human impressions.
In addition, generating articles with AI removes the need to apply one's own reasoning when preparing them, which is a very bad and unproductive thing.
AI blog and search engines
Another problem is that if you can have a fully automated AI blog, anyone else can, too, and if everyone generates millions of generic posts every day, the blogs will have no real value.
It is like the price of gold and sand. Why is gold more expensive than sand? Simply because there is less of it in the world.
We can thus deduce that if there is a lot of something (the same or a very similar thing) it is not possible to use it as a differentiator, and it is the differentiators that play a key role in sorting the results in the SERP.
A simple rule applies to sorting in general. If you want to sort anything, you have to find differentiators (or, differences) between the individual items.
In order to be able to rank the results and display them in the SERP, search engines must find certain differences between the results (or, more precisely, between the sources of those results) based on (technical, on-page, off-page) SEO and a number of other factors, including manual interventions (for reasons that will remain a mystery to us).
The important thing is that search engines also have sorting (ranking) factors divided based on their importance.
It is also generally true that praise from someone else (especially from a credible source) carries more weight than praising yourself.
In SEO terms, this means that off-page SEO is a more important factor than on-page SEO, especially when it comes in the form of a backlink from a high-authority website. Such a backlink can be understood as praise of you (or of your webpage), or as an expression of respect.
Why is off-page SEO a more important factor than on-page SEO (and why will it always be)?
In short, because off-page SEO depends on a third party, not on the blogger, webmaster or SEO specialist themselves, and convincing a third party to praise you and express respect requires effort and, above all, merit.
If it were the other way around, everyone would ensure a 100% on-page SEO, and on-page SEO would thus no longer be considered a differentiator.
Have you noticed that?
If the on-page SEO of each or most webpages is the same or very similar, on-page SEO will lose its importance as a ranking factor!
With the advent of fully automated AI blogs, on-page SEO will cease to be understood as a ranking factor and the text as such will have no meaning for search engines.
Will AI blog mean the end of search engines?
You may ask if fully automated AI blogs will mean the end for search engines as well.
Probably not the end, but certainly a revolution.
Search engines most easily analyse (crawl and index) characters (ASCII, Unicode), digits, words, phrases, sentences and text.
Text analysis is necessary in order to understand, summarise and express the content of a given text, article or blog in just a few words or phrases.
Keywords are the source of organic traffic from search engines.
If you type, e.g., “The Best Blog” into Google, Google will search the database for articles summarised by that keyword (key phrase).
Of course, there are other search operators, such as enclosing a phrase in quotation marks (" "), which returns only sources containing the exact text you entered, but I mention that just for the sake of argument.
In other words, each word or phrase you search for is a keyword belonging to the content of a certain webpage.
Will text lose its relevance?
If the text loses its relevance, the content will have to be created using static, dynamic and mainly functional (interactive) elements.
A big disadvantage is that not all these elements can be analysed in detail and fully used as a comparison and ranking factor. We could say that this will not help on-page SEO much.
The search engine will know that the element is there but will not be able to understand its main essence.
However, if such an element increases the average time users spend on your page, it will be considered a positive factor. The search engine estimates how long it would take, on average, to read a given article, and any extra time users spend on your article will be interpreted as them probably having been motivated by that very element.
A 1,000-word article takes, say, 4 minutes on average to read and understand. If the average time spent on the article is 6 minutes and it contains a functional element (e.g. an input field and a button), then that element is, with high probability, what motivates people to stay longer.
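This reasoning can be sketched in a few lines (the 250-words-per-minute reading speed is my own assumption, chosen so a 1,000-word article takes 4 minutes; it is not a documented search engine constant):

```python
WORDS_PER_MINUTE = 250  # assumed average reading speed

def extra_dwell_minutes(word_count: int, avg_time_on_page: float) -> float:
    """Estimate how much longer visitors stay than reading alone would explain."""
    expected_read_time = word_count / WORDS_PER_MINUTE  # minutes
    return avg_time_on_page - expected_read_time

# A 1,000-word article reads in ~4 minutes; visitors stay 6 minutes on average,
# so roughly 2 "unexplained" minutes may be attributed to an interactive element.
print(extra_dwell_minutes(1000, 6.0))  # → 2.0
```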
If you use Chrome as your browser and such an element sends requests to the server of the given page, Google is able to analyse the number and frequency of HTTP requests visitors have made on that page (e.g. clicks on a button), as well as the data they received.
Static elements: Images / Icons / Infographics
With images it is more difficult, and sometimes impossible, to determine what a given image expresses and whether it is even relevant to the topic.
Although object detection in images is booming, the technology is still in its infancy and, for certain cases, it always will be.
To help search engines better understand the meaning of an image, it is recommended to include the alt attribute in the img tag, e.g. <img src="photo.jpg" alt="Image description">.
Let us look at this from a different point of view and find out what absolutely devastates images as a positive element of on-page SEO.
It is royalty-free stock images and all their possible variations.
It is again true that if, for example, two hundred webpages use the same image in the same topic, even if the technology successfully detects it, it will still be just the same image.
No difference, no positive contribution!
Therefore, using only your own original images will be necessary.
Dynamic elements: Video / Animation
In terms of analysis, video is beyond search engines' current capabilities.
The only thing that can be analysed from a video is its transcribed text. We know, however, that video is not made to be read, but rather to make what would otherwise be read easier to understand.
Unfortunately, it is not possible to assign relevance to such a transcript, nor to index it and then use it as a comparative factor.
For videos, quality factors such as the number of views (especially from different IP addresses), the length of viewing, the number of shares and comments are considered.
To help search engines better understand the meaning of a video, I recommend explicitly including the title attribute in the video tag, e.g. <video title="Video description">.
Functional elements: Plugins / Scripts / VR
Apart from images and videos, there are other and more effective elements that can be used to slightly increase your on-page SEO and thus differentiate yourself from the others.
These elements can be called functional elements such as plugins, various scripts, chat or, in the near future, virtual reality (VR).
You will be able to try on the clothes you are just browsing, hold the smartphone you are about to buy, or test-drive a car from the comfort of your home. With VR you will be able to better decide whether you are really interested in buying a given product.
Such an interactive article will also be better evaluated from the point of view of the intent (purpose) of the search.
For this example model, let us disregard all other SEO factors.
If you create an article about timekeeping and time zones consisting only of text and images, it will be evaluated worse in terms of intent than if you also insert a script that shows the time in each zone, lets users convert times between zones, or lets them find the current time in any city in the world.
If someone wants to learn more about time zones and types “time zones” into the search engine, it is more beneficial for the searcher from the human point of view (as it involves more sensory organs) and more relevant in terms of the search engine if we can offer adequate interaction, besides text and images, which corresponds to what they are searching for.
In this example, converting between time zones or finding the current time anywhere in the world.
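The core of such a widget is a one-line conversion. Here is a minimal server-side sketch in Python using the standard-library zoneinfo module (the zone names are illustrative; a real interactive widget would run in the browser):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

def convert(dt: datetime, src: str, dst: str) -> datetime:
    """Convert a naive datetime from one IANA time zone to another."""
    return dt.replace(tzinfo=ZoneInfo(src)).astimezone(ZoneInfo(dst))

# Noon in New York in mid-January is 6 p.m. the same day in Paris
# (EST is UTC-5, CET is UTC+1, so the offset is 6 hours).
ny_noon = datetime(2024, 1, 15, 12, 0)
paris = convert(ny_noon, "America/New_York", "Europe/Paris")
print(paris.strftime("%H:%M"))  # → 18:00
```

The zoneinfo database handles daylight saving transitions automatically, which is exactly the kind of detail a plain text-and-images article cannot offer interactively.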
If on-page SEO loses its importance, how will search engines rank search results?
As already mentioned, besides on-page SEO there is also technical SEO and off-page SEO.
Technical SEO is a trivial matter that any reasonably skilled webmaster can manage.
And again: if almost everyone can do something, it loses its relevance, so technical SEO cannot be used as a primary differentiator either.
Will off-page SEO be the only relevant ranking factor?
The first two types have been eliminated and we are left with only off-page SEO.
With the advent of fully automated AI blogs, the importance (weight) of off-page SEO will increase tremendously.
Positive feedback and backlinks from high-authority websites will be the key factors influencing PageRank and, consequently, the position of your blog in the SERP.
I know this principle applies even now; the difference is that the difficulty of gaining even one position will increase exponentially.
For this reason, I highly recommend starting quality link building now and, if you are already engaged in link building, increasing its intensity so that your webpage becomes an authority as soon as possible.
The goal of search engines is to provide the most relevant answers to search queries (and to make money), so they will give preference only to high-authority, trustworthy sites. This will sharply increase the already high competition over which sites are perceived as authorities, and make link building incredibly difficult.
For new webpages it will be virtually impossible to get organic traffic from search engines without a monetary investment in white hat link building.
Guest posting will completely disappear, and so will private blog networks (PBNs).
The only way to quickly get to the first place in the SERP will be PPC advertising, which will increase the demand for this type of ads and, if the demand increases, so will the price.
How to prevent toxic content generation?
Is there really no possible solution to prevent such toxic content generation?
Fortunately, there are ways to prevent the arrival of such fully automated AI blogs, and one of them is loading the page content asynchronously and rendering the text in the same colour as its background.
The visitor will still see and understand the page content the same way, but the crawler sent by the AI tool will not "see" the HTML that was loaded asynchronously, and rendering the page will not help it either, because it will not be able to distinguish the text from the background.
If it cannot see the text, it cannot save it to its database, and it will have nothing from which to generate paraphrased, unoriginal content for its articles and thus run the blog.
By the way, this is already happening, in case you did not know. Those applications mentioned in the introduction also send their crawlers to collect content from individual webpages from the whole Internet.
The problem is that search engines also send their crawlers to collect the content of webpages and then index it; this way they would not see anything either, and you need to have indexed content to display it in the SERP.
A solution would be to detect the incoming crawler: if it is a crawler to which you want to display the content, you load the content in the usual way.
Users would only have access to the content after CAPTCHA authentication.
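A simplistic sketch of that decision, assuming User-Agent matching (the allow-list is a policy choice of mine, and the User-Agent header is trivially spoofed, so a real implementation would also verify the crawler's published IP ranges):

```python
# Substrings of User-Agent headers for crawlers we choose to trust.
# "Googlebot" and "bingbot" do appear in the real crawlers' user agents,
# but which crawlers to allow is entirely up to the site owner.
ALLOWED_CRAWLERS = ("Googlebot", "bingbot")

def should_serve_content(user_agent: str, captcha_passed: bool) -> bool:
    """Serve full content to trusted crawlers, or to humans who passed a CAPTCHA."""
    if any(token in user_agent for token in ALLOWED_CRAWLERS):
        return True
    return captcha_passed

print(should_serve_content("Mozilla/5.0 (compatible; Googlebot/2.1)", False))  # → True
print(should_serve_content("SomeScraperBot/1.0", False))  # → False
```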
I will not mention directives such as "noindex" or their variations, as a crawler can simply ignore them.
Basically, any solution must be based on the principle of showing content only to whom you want.
The advent of AI writing tools has started the process of flooding the Internet with unoriginal content.
The gradual improvement of the algorithms behind AI writing tools will mean that neither humans nor machines will be able to determine with 100% certainty what is original, well-researched content and what is mere rewriting or paraphrase.
On the one hand, this will allow us to entrust content generation fully to artificial intelligence; on the other hand, it will be a disaster for SEO and for growing organic traffic from search engines.
If search engines are not able to distinguish whether a given text is original or a perfect rewriting, they will omit this factor and consider it likely that the original text comes from a high-authority webpage.
In other words: if you generate or write an article on a newly created blog (it will not matter which, because text will have lost its relevance), then even if it is the most comprehensible article on the whole Internet, the search engine will prefer a similar article from, say, Forbes.
The vast majority of sites will replace SEO, organic traffic and link building with PPC advertising.
The demand for influencer services and online marketing as such will increase enormously.
Almost no new blogs or websites will be created; instead, there will be a drastic increase in long posts and pages built directly on social networks, simply to put content directly in front of people's eyes without depending on search engines' quality assessment (SEO).
In other words, if you have a high conversion in an article in which you have, e.g., an affiliate link, you as the owner of the article will not care whether it was generated by a machine or a human, as long as you get it to your target group.
As text as such loses its relevance, search engines, including Google, will depend more and more on human references in the form of backlinks to be able to sort and rank the results.
Maybe even backlinks will be further divided according to relevance.
In short, search engines will have to look for still new factors of differences between individual pages, and it certainly will not be text.
Increasing dependence on the human factor in ranking will be a huge step backwards for search engines, whose goal is to replace the human factor with machine learning and "perfect" artificial intelligence.
It will not matter what the content is but where the content is.
Will you be satisfied if the search engine shows you results that are only a very good copy of the original?
Do you prefer a fake or the original?