Google is working on a new algorithm that can determine whether the contents of a web page are correct, and how accurate they are. The Google search engine currently uses the number of inbound links to a website as a proxy for quality, which determines where it appears in the search results. Pages that many other sites link to are ranked higher. This system built the search engine as we know it today, but the problem is that websites full of misinformation can climb in the rankings if enough people link to them.
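To make the problem concrete, here is a deliberately simplified sketch of link-based ranking; it is not Google's actual algorithm, and the page names are made up. It only counts inbound links, so a page full of misinformation wins as long as enough pages point to it.

```python
# Simplified illustration (NOT Google's real ranking): score pages purely
# by how many other pages link to them.
from collections import Counter

links = [
    ("blog.example",  "facts.example"),   # (source page, target page)
    ("news.example",  "facts.example"),
    ("forum.example", "hoax.example"),
    ("blog.example",  "hoax.example"),
    ("spam1.example", "hoax.example"),    # link farms inflate the count
]

inbound = Counter(target for _, target in links)

# Rank by inbound-link count alone: hoax.example comes out on top,
# regardless of whether its content is true.
for page, count in inbound.most_common():
    print(page, count)
```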
A research team at Google is adapting that model to measure the trustworthiness of a page rather than its reputation across the web. Instead of counting inbound links, the system - which is not yet live - counts the number of incorrect facts within a page.
The whole thing is called Knowledge-Based Trust (KBT). The goal is probably not to replace backlinks as the number 1 ranking factor, but to develop an additional signal that makes them less important and allows content quality to be evaluated better. According to the author, initial tests are reportedly very promising.
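The actual KBT research uses a much more elaborate probabilistic model over extracted facts. As a rough sketch of the core idea only, assuming facts have already been extracted from a page as (subject, predicate, object) triples and using a tiny, hypothetical knowledge base:

```python
# Rough sketch of the KBT idea, not the model from the paper: a page's
# score is the share of its extracted facts that agree with a reference
# knowledge base (hard-coded and hypothetical here).
knowledge_base = {
    ("barack obama", "nationality"): "usa",
    ("earth", "shape"): "spheroid",
}

def kbt_score(page_facts):
    """Return the fraction of a page's checkable facts that are correct."""
    checkable = [f for f in page_facts if (f[0], f[1]) in knowledge_base]
    if not checkable:
        return None  # nothing on the page can be verified
    correct = sum(1 for s, p, o in checkable if knowledge_base[(s, p)] == o)
    return correct / len(checkable)

facts_page_a = [("barack obama", "nationality", "usa"),
                ("earth", "shape", "spheroid")]
facts_page_b = [("barack obama", "nationality", "kenya"),
                ("earth", "shape", "flat")]

print(kbt_score(facts_page_a))  # 1.0 -> high trust
print(kbt_score(facts_page_b))  # 0.0 -> low trust
```

In this toy version, many inbound links no longer help a page whose facts contradict the knowledge base, which is exactly the shift from reputation to trustworthiness described above.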
I find this quite interesting. You can see how much Google strives to be able to assess content quality better. So far this works rather mediocrely and is probably based mainly on user signals. How exactly can these user signals be improved without changing even a single letter on the website?