Post by account_disabled on Nov 15, 2023 7:12:29 GMT
“The nofollow link attribute was used incorrectly” — Steven van Vessum, ContentKing

“It's fascinating that Google had to make this change because so many sites were simply misusing the nofollow link attribute. Many sites applied nofollow to all outgoing links (for fear of penalties), and as a result Google developed a blind spot for those links. If too many sites do this, Google is left with too small a data set to determine the popularity of sites. Then you start seeing strange things in the SERPs, such as websites that rank highly but don't 'belong' there at all. That is why Google now says: 'We see the nofollow attribute as a hint', which allows them to better determine the popularity of sites.”

“It has been coming for a while” — Jan-Willem Bobbink, Notprovided.eu

“It has been coming for a while: on the one hand, Google has been trying to identify unnatural and paid links at scale using its own processes. A first attempt to enrich that analysis was the disavow tool (SEOs submit suspicious links themselves), but that apparently did not yield the desired results. By now tasking webmasters with tagging links themselves, Google gets a new influx of link data: pre-labeled links in neat groups, so that Google's training data is enriched for free. On the other hand, too many links carry a nofollow directive, with the result that Google received insufficient data for a representative picture of the web. Little will change for the time being, because changes like these are initially implemented only by the major publishers. After that, we have to wait for platforms such as WordPress to fill in the new directives by default for specific configurations.”
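The "pre-labeled links in neat groups" Bobbink describes come from the rel values Google asks publishers to use: rel="nofollow", plus the rel="sponsored" and rel="ugc" attributes announced alongside the hint change. As a rough illustration (not Google's actual pipeline), a crawler might bucket a page's outgoing links by these rel hints; the HTML snippet and bucket names below are assumptions for the sketch:

```python
# Illustrative sketch: group outgoing <a href> links by the link hint in
# their rel attribute, treating "sponsored", "ugc", and "nofollow" as
# hints about the nature of the link rather than hard directives.
from html.parser import HTMLParser


class LinkHintParser(HTMLParser):
    """Collects <a href> links grouped by the hint found in rel."""

    HINTS = ("sponsored", "ugc", "nofollow")  # most specific hint wins

    def __init__(self):
        super().__init__()
        self.buckets = {hint: [] for hint in self.HINTS}
        self.buckets["follow"] = []  # links with no restricting rel value

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel may hold several space-separated tokens, e.g. "sponsored nofollow"
        rel_values = (attrs.get("rel") or "").lower().split()
        for hint in self.HINTS:
            if hint in rel_values:
                self.buckets[hint].append(href)
                return
        self.buckets["follow"].append(href)


html = """
<a href="https://example.com/partner" rel="sponsored nofollow">ad</a>
<a href="https://example.com/comment" rel="ugc">user link</a>
<a href="https://example.com/scared" rel="nofollow">blanket nofollow</a>
<a href="https://example.com/editorial">normal link</a>
"""

parser = LinkHintParser()
parser.feed(html)
for hint, links in parser.buckets.items():
    print(hint, links)
```

The blanket-nofollow behavior the quotes criticize shows up directly here: a site that tags every outgoing link rel="nofollow" leaves the "follow" bucket empty, which is exactly the blind spot that pushed Google toward treating these values as hints.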