The Theory of the PageRank Algorithm
The PageRank algorithm evaluates web pages on the premise that the more inbound links a page has, the more significant it is. Google originally used this algorithm as the basis of its ranking. In today’s online world, however, many more signals influence how sites rank, which is essentially why PageRank has lost its significance.
The PageRank algorithm focused on the number of incoming external backlinks and the ranking status of the linking pages. Regardless of its content, a page ranked better when authoritative sites linked to it; a site’s ranking was thus directly proportional to the evaluation of the pages linking to it.
This link following is based on the Random Surfer Model (RSM): a searcher who surfs the World Wide Web (WWW) erratically and thereby visits a series of pages.
The Algorithm
The PageRank of a web page is calculated recursively:
PR(X) = (1 − d) + d(PR(T1) / C(T1) + … + PR(Tn) / C(Tn)), where PR(X) denotes the PageRank of the page X, T1 to Tn are the pages linking to X, and C(T1) to C(Tn) denote the number of outbound links on the pages T1 to Tn.
The factor d is a damping factor between 0 and 1 (commonly set to 0.85). It reflects the fact that the Random Surfer Model (RSM) does not follow links forever: at each step, the surfer abandons the current chain of links with probability 1 − d. The published PageRank values were updated only about once a year.
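The recursive formula above can be evaluated by simple iteration: start every page at PR = 1 and repeatedly recompute each page’s value until the numbers settle. The following sketch assumes a tiny three-page link graph and an iteration count chosen for illustration; the graph, function name, and parameters are not from the original text.

```python
# A minimal sketch of iterative PageRank, following the formula
# PR(X) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)).
# Assumes every page has at least one outbound link (no dangling pages).

def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    pr = {p: 1.0 for p in pages}  # start every page at PR = 1
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # Sum PR(Ti) / C(Ti) over all pages Ti that link to `page`,
            # where C(Ti) is the number of outbound links on Ti.
            inbound = sum(pr[t] / len(links[t])
                          for t in pages if page in links[t])
            new_pr[page] = (1 - d) + d * inbound
        pr = new_pr
    return pr

# Hypothetical web: A links to B and C, B links to C, C links back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

In this example C ends up with the highest PageRank, since it receives links from both A and B, while B receives only half of A’s vote (A splits its PageRank across two outbound links). Note that this original variant of the formula makes the values sum to the number of pages, not to 1.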