- negative seo doesn't work, google can tell the difference between spam links/sites
- it's probably an issue with your own site (i.e. the deranking is justified)
- changes in rank are likely due to an algorithm update
I'm not sure if the mods on that forum are even Google employees - they seem to just be random users, and I'm not sure how they're selected. I've never seen anyone on that forum acknowledge that negative SEO is effective.
They have a mob of high-ranking posters upvoting each others' replies, patting each other on the back and ultimately ganging up on anyone who doesn't accept their replies.
We had an issue with a site incorrectly flagged as carrying malware. Submitted multiple review requests through the Webmasters console; all had no effect and produced no replies. Completely baffled by the situation, posted to GWF. GWF's top-rated reply and their "general consensus" was that we were morons who don't know our way around basic server administration.
A couple of weeks later we discovered that the Webmasters console was plainly broken in the browser we used to submit review requests. You'd click "Submit", it would go "inactive" and the page would reload after a bit, but nothing got sent out. Re-submitted the request with a different browser, it worked, and the issue got resolved in a matter of hours, but the GWF interaction left a very bad aftertaste. F-, won't do it again.
That position is utterly silly to me.
How can Google defend the idea that "blackhat SEO" you do yourself will tank your site, while saying others doing the same exact thing to you won't work?
Are they claiming some magical ability to discern ownership and intent?
I have no strong opinion whether that claim is true or not in practice, but that's what's being implied.
I have seen unintended negative SEO: a site got hacked and thousands of pages of spammy porn links were added.
I suspect it's harder these days, but Google isn't perfect - they are only human, and they make mistakes just as we all do sometimes.
A similar process is laid out in the recommendations for doing a 'disavow' through Google Webmaster Tools - the amount of time all this takes is not trivial.
Then, it happened again two months later, with a whole 'nother set of bad links from crappy sites.
Since then, it pretty much happens every month: a hundred or so new sites with crappy, likely expired or expiring domain names. Many appear to have names that would have once been used for link-ring SEO - and were likely found to stop helping their clients' sites and instead start hurting them - so now the links get pointed at ours.
So it's about 50-100 new ones every month. I've submitted so many takedown requests and disavow lists - with the dumb, specific UTF formatting they require - ugh. It's a major time suck, with no way to tell what is helping or hurting.
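For context (and for anyone lucky enough not to have gone through this), the disavow file itself is just a plain-text list: UTF-8 encoded, one entry per line, a "domain:" prefix to disavow a whole domain, and "#" for comments. A minimal sketch of generating one - the domain names here are made up:

```python
# Rough sketch of a disavow file in the format Google's tool expects:
# plain UTF-8 text, one entry per line, "domain:" to disavow a whole
# domain, "#" for comments. Domains below are placeholders.
bad_domains = ["spammy-expired-domain.example", "old-link-ring.example"]
bad_urls = ["http://random-blog.example/comments/123"]

with open("disavow.txt", "w", encoding="utf-8", newline="\n") as f:
    f.write("# Links we did not place and do not endorse\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")
    for url in bad_urls:
        f.write(f"{url}\n")
```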
I suppose if your site was attacked once and left alone it would be worth the effort, but if someone is serious about taking you down, the effort is better spent not trying to please google and instead focusing on better places to get found like snap, insta, fbook ads, etc.
I admit my experience is a small data point - YMMV for sure.
Please let webmasters have links count only when they approve them! This would stop bad SEO AND negative SEO, right? The webmaster console could have an option to only count links that are approved in the console - if someone paid for bad SEO and approved the links, then that would be proof. If it was a negative SEO attack, the links would not be approved and would not count against the site.
Not only would this stop negative SEO attacks, it would also make it explicit if a website was trying to use shady SEO - only manually approved links would count - so if someone checked off 300 comment links and 100 wikis or whatever, there would be no doubt about the intent.
It would make shady seo and negative seo harder, and make it easier for those getting attacked.
Today it also hit me that if this went through, it could perhaps prevent Google bombing ("miserable failure" and the like) as well - and I can see reasons that could be considered good, and reasons others might not want to make that an option.
I read a while back that Googlers most likely won't be reading any of the disavows or the info contained within, and I get the same feeling from posting in the webmaster forums - it's purposefully set up so that no one knows if Google has seen the posts.
I'm sure there are good reasons for that, like secrets of the algo, and legal reasons for avoiding things... but it's really hurtful to so many.
One of the last times I got into a thread, after research from some 'top posters' or whatever they are called, they just said something like: with some key phrases there is and has been so much spam and so many different spam techniques that they just freeze the results, put a few at the top and the rest far down. So there is no hope of getting those changed.
We have more legitimate questions about certain things, and debated in house whether using a bunch of Google's things (Analytics, Fonts, Tag Manager, all of it) might help us do better - that seemed unfair at the time. Then, a few weeks ago, someone posts on HN about SEO and one of the main pieces of advice is 'use all of Google's scripts'.
It becomes a conflict of interest, I think, and a conflict for our users and their privacy with sensitive subjects - yet others are willing to do that all day for the G traffic - and they get top results.
It's been a funny (and not so funny) thing at the home office seeing a top sex result running Google ads. There are so many conflicts when trying to do certain things with Google - and trying to do what's right is not what is shown to the world as working; it's frustrating.
I suppose a big goal for the Google spam team has been to make figuring things out really frustrating for SEO people - and I have seen the result, watching so many others explain how they got good results for a while and then got kicked out... and not just with bad links, but with good content... so the goal of frustrating so many has been achieved.
On the flip side, Google has publicly said for a long time: make a good site with content the visitors want to enjoy, don't buy links, and you will do fine in the results. Well, this does not seem true today and has not for years now.
again, small data point, ymmv
Edit: I don't see this approach having a ghost of a chance to work and it seems ignorant of the situation described in the OP.
For example, a lot of popular podcast platforms will wrap your notes on their public crawlable site but don't set a canonical URL back to your site.
From Google's POV this must look like duplicate content.
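As a rough illustration of what's missing: the platform's copy of the page would need a rel=canonical link pointing back at the original. A quick sketch of checking whether a page declares one - the URL is made up, and the library choices (requests, BeautifulSoup) are just what I'd reach for:

```python
# Check whether a page declares a rel=canonical URL.
# The URL below is a placeholder, not a real platform.
import requests
from bs4 import BeautifulSoup

def get_canonical(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", rel="canonical")
    return link.get("href") if link else None

# If this returns None (or the platform's own URL rather than yours),
# Google has no hint that your site is the original source of the text.
print(get_canonical("https://podcast-platform.example/show/episode-1"))
```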
0: https://www.google.com/search?q=watch+spider+man+online+free
Does that second option make any sense?
It's not uncommon for authoritative content to be plagiarized and one needs to be ever vigilant and be prepared to take direct action rather than relying upon the wisdom of google's algorithms (which seem to be almost wholly ineffective in terms of discriminating against such practices).
No, because the hacked sites only display the fraudulent content to web crawlers, and explicitly hide it from legitimate users.
It's such a specific quotation that there aren't that many sites that have that text, so the spam sites are gonna appear.
Now if those sites outrank his site for regular search terms "what is a polygraph" ... then that's a problem. But he didn't seem to indicate that's the case.
> With the text of my homepage replicated across hundreds of such modified pages, it has essentially disappeared from Google for key search terms for the site, such as "polygraph."
On a brief glance, it doesn't look like he optimizes for "polygraph". He does rank #1 for antipolygraph.
The attackers seem to have achieved that goal.
It's annoying that it can happen. If someone wants to hurt your brand they can do it in more ways than one.
Spam Reddit till a domain gets blocked, buy cheap backlinks on Fiverr, etc.
I'm not sure how well google is responding to negative SEO.
Could someone explain this bit to me? Don't follow it.
2. When a search engine crawler goes to the site, it displays the attacker's content.
3. When a ‘normal’ user visits the site, it displays the original site or a generic marketing site.
It’s a tactic to prevent the original owner from knowing their site was hacked.
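A crude way to see this kind of cloaking for yourself is to fetch the same URL twice - once with a normal browser User-Agent and once pretending to be Googlebot - and compare what comes back. A sketch under those assumptions (the URL is made up, and some cloakers key off the requesting IP instead of the User-Agent, which this won't catch):

```python
# Compare what a site serves to a "browser" vs. to "Googlebot".
# A large difference suggests the site is cloaking.
import requests

URL = "https://possibly-hacked-site.example/"
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch(user_agent):
    return requests.get(URL, headers={"User-Agent": user_agent}, timeout=10).text

browser_view = fetch(BROWSER_UA)
crawler_view = fetch(GOOGLEBOT_UA)

# Very rough comparison; a real check would diff the visible text.
if abs(len(browser_view) - len(crawler_view)) > 0.2 * max(len(browser_view), 1):
    print("Responses differ a lot - possible cloaking.")
else:
    print("Responses look similar.")
```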
Everything was the same as you stated. The only difference was that users coming from Google were redirected to some site selling Microsoft keys.
I tried to research it more, but since it was related to piracy I didn't want to get involved much. Still, I couldn't find any solution for it.
They were doing that 15 years ago and it was causing headaches back then...
Faked pagerank 9 domains used to pull a pretty penny on ebay...
That is, who could possibly benefit from hiding anti-polygraph sites? Why, only all the entities in the world who use the polygraph to intimidate people; i.e. all the world's collected military and intelligence agencies, plus any really large corporations either with ties to them or who use the polygraph themselves.
I don’t think he can expect any help from Google.
Seen 3:24pm
Google surpassed Altavista because it fundamentally brought a better way of indexing with it, and survived to dominate because it paired that with an incredibly insightful path to revenue.
Are memories really that short? (Am I really that old?)
We can argue around the edges, but I think any modern search engine user would be amazed at how bad bad bad search engines were for the basics of answering your search query, before google.
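For anyone who wasn't around for it: the "better way of indexing" was largely PageRank - scoring pages by the link structure of the web rather than just the words on the page. A toy power-iteration sketch on a made-up four-page web, just to show the shape of the idea (not how Google computes it at scale):

```python
# Toy PageRank via power iteration on a tiny made-up link graph.
# links[p] lists the pages that p links out to.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

d = 0.85                         # damping factor
pages = list(links)
n = len(pages)
rank = {p: 1.0 / n for p in pages}

for _ in range(50):
    new = {p: (1 - d) / n for p in pages}
    for p, outs in links.items():
        share = rank[p] / len(outs)
        for q in outs:
            new[q] += d * share
    rank = new

# "c" comes out on top: everything links to it, directly or indirectly.
print(sorted(rank.items(), key=lambda kv: -kv[1]))
```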
It wasn't a full search-engine, although it looked like it.
It was more like a very popular demo.
From the outside view, you took your own website (and domain you own) and began low quality black hat SEO.
It's like expecting Microsoft to step in when somebody got into your Windows computer by putting in the correct password.