After Joomla’s recent security issues, people have been double-checking their sites. In some cases it’s easy to tell if your site has been hacked (the large Turkish flag and blaring music are strong hints) and on other occasions, the hackers might leave no trace.
One of my colleagues found a very subtle hack … his robots.txt file had been altered to block his entire site from being indexed by Google. The hack had been in place since June, costing him all his rankings. It’s likely that this was a highly motivated rival rather than just another group of script kiddies.
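To illustrate, a hacked robots.txt blocking an entire site from every crawler might look something like this (a hypothetical example, not the actual file from the hack):

```
# A single "Disallow: /" under the wildcard user-agent
# tells all well-behaved crawlers, including Googlebot,
# to skip every page on the site.
User-agent: *
Disallow: /
```

A change like this is easy to miss: the site still looks and works exactly the same to human visitors, and the damage only shows up weeks later as rankings disappear.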
Is it possible to defend against these subtle attacks? In this case, yes.
How to Track Indexing Problems Daily
- Register your site with Google Webmaster Tools
- Go to Tools >> Gadgets
- Select “Crawl Errors”
- Repeat for all your important sites.
- Visit your iGoogle.com page daily. You’ll see any crawl errors as soon as Google does. In the case of the hack mentioned above, the “URLs restricted by robots.txt” count would jump into the hundreds or thousands.
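Beyond checking Webmaster Tools by hand, the same kind of watch can be automated. Here is a minimal sketch of a script you could run daily from cron; the file name, site URL, and function names are placeholders of my own, not anything from Google’s tools:

```python
# Sketch: compare the live robots.txt against a saved known-good copy
# and flag either a change or a site-wide "Disallow: /".
import urllib.request


def fetch_robots(site_url):
    """Download the current robots.txt for a site."""
    url = site_url.rstrip("/") + "/robots.txt"
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")


def blocks_everything(robots_txt):
    """True if any Disallow rule blocks the whole site ("Disallow: /")."""
    for line in robots_txt.splitlines():
        rule = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if rule.lower().replace(" ", "") == "disallow:/":
            return True
    return False


def robots_changed(current, baseline):
    """True if the live file differs from the saved known-good copy."""
    return current.strip() != baseline.strip()


if __name__ == "__main__":
    # Hypothetical baseline file you saved when the site was known-clean.
    baseline = open("robots_baseline.txt").read()
    live = fetch_robots("https://example.com")
    if robots_changed(live, baseline) or blocks_everything(live):
        print("WARNING: robots.txt changed or now blocks the whole site!")
```

A script like this would have caught the June hack within a day instead of months later, since the hostile edit shows up the first time the comparison runs.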