The following Q&A adds something to this discussion.
This question and answer in particular relate to the problem of negative link mentions from competitors. One part is especially clear: "If a webmaster wants to shoot themselves in the foot and disavow high-quality links, that's sort of like an IQ test". Below are the question and answer from which I took that excerpt.
Question from Danny Sullivan (searchengineland.com):
What prevents, and I can't believe I'm saying this, but seemingly inevitable concerns about negative negative SEO? In other words, someone decides to disavow links from good sites as perhaps an attempt to send signals to Google that these are bad? More to the point, are you mining this data to better understand what are bad sites?
Answer from Matt Cutts (Google):
Right now, we're using this data in the normal straightforward way, e.g. for reconsideration requests. We haven't decided whether we'll look at this data more broadly. Even if we did, we have plenty of other ways of determining bad sites, and we have plenty of other ways of assessing that sites are actually good.
We may do spot checks, but we're not planning anything more broadly with this data right now. If a webmaster wants to shoot themselves in the foot and disavow high-quality links, that's sort of like an IQ test and indicates that we wouldn't want to give that webmaster's disavowed links much weight anyway. It's certainly not a scalable way to hurt another site, since you'd have to build a good site, then build up good links, then disavow those good links. Blackhats are normally lazy and don't even get to the "build a good site" stage. 🙂