Is dynamically generated information (DGI) from more trusted sources the future of search engines?

Google says: “Our mission is to deliver the most relevant and reliable information available”.

But what if the most relevant information can only come to be as a combination of several trusted sources?

Search engines have to display relevant answers; otherwise they would, simply put, decline in popularity and users would start to use them less. And if search engines lost users, they would also lose customers and the billions of dollars they generate by serving advertisements to those customers.

Let me start with a question.

Were search engines created to provide the most relevant information or to make billions of dollars?

The answer will vary depending on what social class you belong to and whether you are an employee or an entrepreneur!

Currently, the information displayed by Google is only as relevant and reliable as the single source behind each result.

If you click on the first result on the SERP, you have clicked on the answer that Google considers the most relevant and reliable at that moment.

If you do not find anything there, you either go back to the SERP and click through the other results, or get distracted by other topics and content on that first page and completely forget what you were searching for.

Every time you click on another source, you are wasting your time as you have to start reading new content again and again, adapt to its tone and, most importantly, understand it.

The more sources you visit in this way, the harder it is for you to put the individual pieces of information into context and understand them as a whole.

However, it should be noted that not all searchers get the same order of answers to the same query on the SERP, so they may not always get the same answer, either.

The ranking of results on SERPs combines the source's ranking factors with the searcher's geographic location, search history, language and other preferences.

If not everyone gets the most relevant and reliable information available, we can immediately deduce that there is a diametrical difference between Google’s mission and reality.

You know how it goes then.

Incomplete, incorrect or misunderstood information begets more incorrect information, that incorrect information begets still more, and so on, until eventually a “webmaster or blogger” includes the distorted information in their content, and, Google, here’s another unreliable source for you, you are welcome.

By displaying one unreliable source, Google has multiplied it and generated more unreliable sources.

If Google wants to successfully fulfil its mission and actively combat unreliable sources and misinformation as such, it needs to make radical changes in the very displaying of information.

If you think that a quick solution would be reducing the number of results ranked on SERPs to, say, only 20, you can forget about it: there is already a large number of reliable websites, social profiles and documents, and, more importantly, their number is growing every day.

For example, if only the top 100 officially disclosed billionaires comment on a particular investment opportunity, you get 100 different results from that alone, and I think that billionaires are a credible source when it comes to investing (especially in their own stocks).

Add to that the statements of traders, brokers, investors, businessmen, politicians, financial experts, academics, etc. and you have hundreds to thousands of different sources.

Google, now be wise!

You have thousands of reliable and relevant sources at your disposal but which is the most relevant?

Since “the most relevant and reliable information/answer” is a superlative (third-degree adjective), there can be only one such answer; it must consist of a single whole and must be given to every searcher of the query in question.

Therefore, ladies and gentlemen, allow me to introduce dynamically generated information – DGI.

What will the generation and display of DGI be like?

After you enter a query into the search engine, the AI will generate a human-readable answer (error-free and with smooth continuity of individual words, sentences and non-textual elements) for you that will consist of multiple trusted sources and will be displayed on a single page.

The AI, based on its inference, will determine the length and structure of the information, a balanced ratio of textual and non-textual elements, and the number of sources.

This way you will have the most relevant information from all over the Internet summarised in one piece of DGI.
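To make this concrete, here is a minimal sketch (in Python, purely for illustration) of what such a pipeline could look like: rank candidate sources by a reliability score, keep the most trusted ones, and let a GPT-style model turn their excerpts into one continuous answer. The `Source` structure, the reliability values and the simple concatenation that stands in for real text generation are all assumptions of mine, not any search engine’s actual implementation.

```python
# Illustrative DGI pipeline sketch; every helper and value here is hypothetical.
from dataclasses import dataclass


@dataclass
class Source:
    url: str
    excerpt: str
    reliability: float  # 0.0-1.0, as an authority algorithm might score it


def build_dgi(query: str, candidates: list[Source], max_sources: int = 5) -> str:
    """Combine excerpts from the most reliable sources into a single answer."""
    # 1. Keep only the most trusted sources for this query.
    trusted = sorted(candidates, key=lambda s: s.reliability, reverse=True)[:max_sources]

    # 2. A GPT-style model would rewrite the excerpts into smooth, continuous text;
    #    plain concatenation keeps this sketch runnable without a model.
    body = " ".join(s.excerpt for s in trusted)
    citations = ", ".join(s.url for s in trusted)
    return f"{body}\n\nSources: {citations}"


if __name__ == "__main__":
    demo = [
        Source("https://example.com/a", "Excerpt from source A.", 0.92),
        Source("https://example.com/b", "Excerpt from source B.", 0.67),
        Source("https://example.com/c", "Excerpt from source C.", 0.81),
    ]
    print(build_dgi("tesla stock", demo))
```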

An example of DGI for the search term “Tesla stock”:

  • The introduction verbally describes all the important milestones in the development of Tesla stock from the IPO to the present.
  • Below that, an interactive chart shows the price movements on the stock exchange (information from a trusted broker).
  • Excerpts from field-specific journals and/or peer-reviewed papers (PDF documents) describing new technologies that the company has recently introduced or plans to launch soon.
  • A summary of the effects of current world events on the development of the stocks.
  • Evaluations, recommendations and various predictions by leading experts from various well-known journals and directly from their social profiles.
  • The most important tweets from Musk himself.
  • In conclusion, a brief summary of the DGI.
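One way to picture the composed answer is as an ordered list of sections, each tied back to the kind of source it came from. The field names below are my own assumptions, purely to illustrate the structure of the example above.

```python
# Hypothetical representation of the "Tesla stock" DGI as ordered sections.
tesla_stock_dgi = [
    {"type": "text",   "section": "Milestones from the IPO to the present", "source": "financial news"},
    {"type": "chart",  "section": "Interactive price chart",                "source": "trusted broker"},
    {"type": "pdf",    "section": "Excerpts on new technologies",           "source": "peer-reviewed journals"},
    {"type": "text",   "section": "Impact of current world events",         "source": "news outlets"},
    {"type": "text",   "section": "Expert evaluations and predictions",     "source": "journals and social profiles"},
    {"type": "social", "section": "Key tweets",                             "source": "Musk's profile"},
    {"type": "text",   "section": "Brief closing summary",                  "source": "generated"},
]
```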

A huge advantage of DGI is that it can be standardised for a certain time interval, which means providing the same information/answer over and over for the same or semantically similar queries.

The DGI standardisation process is based on machine learning of source quality and searcher satisfaction factors.

This will ensure that the same DGI will be read by anyone, anywhere in the world.

It will further reduce the risk of the searcher selecting inappropriate sources and thus the risk of misinformation.

If the AI detects, after a certain period of time, that a given piece of DGI needs an update, the AI will update it itself and standardise it again for a certain time interval.
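A rough sketch of that standardisation, assuming nothing more than a cache with a time-to-live and a toy query normaliser in place of real semantic matching, could look like this:

```python
# Minimal sketch of DGI standardisation: serve one cached answer for a fixed
# interval, regenerate when it expires. Both the normaliser and the TTL logic
# are simplified assumptions, not a description of any real system.
import time


class DGICache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}  # normalised query -> (timestamp, answer)

    @staticmethod
    def _normalise(query: str) -> str:
        # Real systems would map semantically similar queries to one key;
        # lower-casing and sorting the words is only a toy approximation.
        return " ".join(sorted(query.lower().split()))

    def get(self, query: str, regenerate) -> str:
        key = self._normalise(query)
        now = time.time()
        cached = self._store.get(key)
        if cached and now - cached[0] < self.ttl:
            return cached[1]            # the standardised answer, same for everyone
        answer = regenerate(query)      # the AI regenerates and re-standardises
        self._store[key] = (now, answer)
        return answer


if __name__ == "__main__":
    cache = DGICache(ttl_seconds=3600)
    print(cache.get("Tesla stock", lambda q: f"Generated DGI for '{q}'"))
    # Semantically similar query, served from the cache within the TTL:
    print(cache.get("stock Tesla", lambda q: "not reached"))
```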

In short, DGI will operate on a similar principle to, for example, university textbooks prepared by highly professional academics.

Anyone who reads those textbooks will obtain exactly the same information (whether they understand it or not depends on their intelligence :).

If a new genius comes along and discovers something new, the textbooks will be supplemented with that discovery and, again, become the standard textbooks, and so on and so on.

It is also very necessary to point out that it does not matter if someone unknown (yet without recognition) discovers something new in the meantime; it simply will not be included in the textbooks unless the academic community recognises it!

What will power the DGI?

Already today there is GPT-3, a language model that can produce human-like text drawing on multiple sources.

Search engines, in turn, have algorithms for calculating the authority of websites and webpages.

The algorithm that will generate DGI will be a combination of these two types of algorithms, further improved.
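Those authority calculations are proprietary, but the classic textbook example of one is PageRank. The sketch below shows only that textbook iteration, as a stand-in for the “authority” half of the combined algorithm; it is not Google’s actual ranking code.

```python
# Textbook PageRank iteration, shown purely as an example of an authority
# calculation; real search-engine ranking is far more complex and proprietary.
def pagerank(links: dict[str, list[str]], damping: float = 0.85, iters: int = 50) -> dict[str, float]:
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank


if __name__ == "__main__":
    toy_web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
    print(pagerank(toy_web))
```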

Advantages

From the searcher’s point of view, it is an excellent time-saving solution because he or she does not have to click through a large number of websites, social networks and documents and read (understand) often very long, repetitive and incomprehensible content.

In addition, the risk of misinformation that would otherwise be present when visiting a large number of different sources is significantly reduced.

Google’s current mission is nice but nobody can prevent a searcher from clicking on a result somewhere on page 20 on the SERP where relevance and reliability are rapidly decreasing.

What is even worse is when the searcher follows hyperlinks from such a deep result to an even deeper source, and from that to an even deeper one, and so on.

Unlikely?

With 3.5 billion searches a day, if even one in every 10,000 answers is clicked on in the above-mentioned manner, then every day there are 350,000 answers read from very unreliable sources, which is a huge number and constitutes a failure of Google’s mission.

Not to mention the number of artificial websites that exist solely for machines or to manipulate positions on SERPs (link farms, PBNs).

Another advantage will be the option to change the tone of the DGI according to the user’s preference.

The essence of the information will be preserved; it is just that one sometimes understands something better when it is said in a different tone or in different words. This is a piece of cake for GPT.
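A plausible way to do that is simple prompting. The sketch below assumes a hypothetical `generate` callable standing in for whatever language model the search engine would use; the prompt wording is my own.

```python
# Tone adjustment by prompting; `generate` is a hypothetical stand-in for a
# language-model call, and the prompt wording is purely illustrative.
def rewrite_in_tone(dgi_text: str, tone: str, generate) -> str:
    prompt = (
        f"Rewrite the following answer in a {tone} tone. "
        f"Keep every fact unchanged.\n\n{dgi_text}"
    )
    return generate(prompt)


if __name__ == "__main__":
    def fake_generate(prompt: str) -> str:
        return f"[model output for: {prompt[:50]}...]"

    print(rewrite_in_tone("Tesla stock rose after the earnings call.", "casual", fake_generate))
```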

The advantage will certainly also be felt by the authors of all the sources (websites, social profiles, PDF documents, etc.) whose information is used to create the DGI and which would otherwise not be visited when answers are displayed in the conventional, ranked way.

Nowadays it happens that even if a certain source of information is on the first page on the SERPs, it does not mean that the searcher will always visit all the sources listed on that first page.

Disadvantages

The probability of a click on a particular source will decrease as the number of the sources used increases.

In its simplest form, the probability of a click will be P(click) = 1 / (number of sources displayed in the DGI).

The order of the sources in which the information is generated will also determine the probability of a click on a particular source.

If the information were disproportionately long, users would click on the sources right at the beginning, at some points as they scrolled, or at the very end of the information.

The length (excerpt, scope) of the information from the original source will also be a factor that will influence the clicks on the original source.

For example, if 149 sentences from various other sources and 1 short sentence from your source were used, and the sentence from your source was placed somewhere in the middle of the information generated in this way, the probability of a click on your source would, in the simplest form, be P(click) = 1/150.
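Putting the three factors together (number of sources, position in the DGI and excerpt length), a toy model might weight the uniform 1/n baseline by position and length. The decay constant and the weighting below are invented purely for illustration.

```python
# Toy click-probability model: longer excerpts and earlier positions score
# higher; the exact weighting is an invented illustration, not a real metric.
def click_probabilities(excerpt_lengths: list[int], position_decay: float = 0.95) -> list[float]:
    scores = [
        length * (position_decay ** position)
        for position, length in enumerate(excerpt_lengths)
    ]
    total = sum(scores)
    return [s / total for s in scores]


if __name__ == "__main__":
    # 150 one-sentence excerpts; "your" sentence sits in the middle (index 75).
    probs = click_probabilities([1] * 150)
    print(f"Uniform baseline:            {1 / 150:.4f}")
    print(f"Toy model, middle position:  {probs[75]:.4f}")
```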

Another disadvantage is a lower number of conversions.

If the original source has, e.g., PPC ads integrated in the content, DGI will ignore them and “take” only the content around them.

How will the sources used be compensated for monetised content (PPC ads, banners, affiliate links, etc.)?

The above-mentioned disadvantages indicate that DGI reduces the traffic to the original sources, so all the sources used must be compensated for the effort of creating quality content and for financial losses from ads.

A solution is to award financial compensation to the sources used in adequate proportion to the amount of information they have provided for the generation of the DGI.
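In its simplest form that could be a straight pro-rata split. The sketch below assumes a fixed payout pool per DGI and uses character count as the measure of “amount of information”; both are assumptions for illustration.

```python
# Pro-rata compensation sketch: split a payout pool among the sources used,
# in proportion to how much of the DGI each one supplied (here: characters).
def compensate(contributions: dict[str, int], pool: float) -> dict[str, float]:
    total = sum(contributions.values())
    return {source: pool * chars / total for source, chars in contributions.items()}


if __name__ == "__main__":
    used = {"broker.example": 1200, "journal.example": 800, "blog.example": 400}
    print(compensate(used, pool=10.0))  # e.g. 10 currency units for this DGI
```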

If DGI pays the sources for the content provided, will this replace content monetisation as we know it today?

Not quite, as not all visitors to the source come from the search engine.

Some people may have the source bookmarked, others may enter the URL of the source directly into the browser, or they may access the source from any other source via a hyperlink.

If DGI replaces the current ranked display of answers on SERPs, how will off-page SEO and especially the PageRank of individual pages be handled?

Each source will have to take more and more care to be perceived as reliable and to be chosen as a source by DGI.

Backlinks from high-authority websites will still be the top differentiator.

I would like to highlight the fact that not so much the backlinks themselves but rather the length of the anchor text will be the top differentiator.

Sources that have so far had low reliability will try to “sell” as much of their content/information as possible to DGI sources.

In other words, the longer the anchor text you have for the DGI source the better.

Comments, reviews and source sharing by authenticated experts on trusted sources will be taken into consideration as another top differentiator.

PageRank will gradually change into DGI Rank and will play a key role not only in assessing the reliability of DGI sources but also in determining how much and what type of information is used from each source.
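Nobody outside a search engine can say how such a score would actually be computed, but the three differentiators above could, for the sake of argument, be folded into one number roughly like this (the signals, normalisations and weights are all invented):

```python
# Entirely speculative "DGI Rank" built from the three differentiators above;
# every signal, normalisation and weight here is invented for illustration.
from dataclasses import dataclass


@dataclass
class SourceSignals:
    backlink_authority: float  # aggregate authority of linking sites, 0-1
    avg_anchor_length: float   # average anchor-text length in words
    expert_endorsements: int   # reviews/shares by authenticated experts


def dgi_rank(s: SourceSignals) -> float:
    anchor = min(s.avg_anchor_length / 10.0, 1.0)     # cap at 10-word anchors
    experts = min(s.expert_endorsements / 50.0, 1.0)  # cap at 50 endorsements
    return 0.5 * s.backlink_authority + 0.3 * anchor + 0.2 * experts


if __name__ == "__main__":
    print(dgi_rank(SourceSignals(backlink_authority=0.8, avg_anchor_length=6.0, expert_endorsements=12)))
```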

Conclusion

It is important to bear in mind that the arrival of DGI will not be sudden but gradual and that DGI will not be immediately applied to all queries. However, with billions of searches a day, every percent of results displayed as DGI will significantly increase the quality and consistency of the information that the search engine supplies to its users and that, last but not least, influences those users.

If all search engines need in order to generate DGI is to add GPT to their source-reliability evaluation algorithm, then GPT can add a source-reliability evaluation algorithm of its own and become the same thing!

Do you see what is happening here?

Let me answer you with a question:

Will future versions of AI writing tools (powered by GPT) replace search engines?

Do you want to know how AI writing tools affect search engines and your SEO? We recommend that you read the following article: Will fully automated AI blog destroy the Content is King Era?


Author Bio

Rafael W.

Head of Content. Rafael is an entrepreneur, investor, software developer, SEO specialist, natural problem solver and startup enthusiast.
