Why Does Google Penalise Certain Websites?

Google uses various algorithms, such as Hummingbird, Panda and Penguin, to evaluate websites, and it keeps updating them to offer the best results to users. When an update is rolled out, the ranking parameters may change, and some websites can end up with a penalty as a result.

Google issues two kinds of penalties: automatic and manual. An automatic penalty is applied by the algorithm itself, without any notification to the site owner.

To find out whether your website has been penalised automatically, look for these signs: your rankings drop suddenly, page-one positions slip to page two, or the site has been removed from the index entirely and a site search yields no results.

These are the reasons why Google may penalise your website:

1. Google values originality, so duplicate content is penalised. Do not reuse a manufacturer's text and specifications to sell products, duplicate images and tables, or copy image descriptions; the description of an image file should differ from the original.

2. Using too many keywords can harm your website. Google treats keyword stuffing as an attempt to manipulate rankings, which it considers a doubtful practice.

3. Do not buy links from other sites to point to your website. Pages created solely for linking are not acceptable, and you should not exchange links in bulk. Too many similar links pointing to the same place looks like automation, and rented links count as paid links. Google values links that are acquired naturally.

4. Do not add excessive ads to your website, as this worsens the user experience. Google may penalise a page for carrying too many ads.

5. Footer links are used by web designers as a navigation aid, but some try to pass link juice unnaturally through them. Avoid this, especially when acquiring backlinks.

6. All the links on your website should be visible; hiding things is considered suspicious. Do not use the same colour for a link and its background or button, and do not use hidden content to manipulate keyword weight or the page theme.

7. Do not use content from other websites as Google sees scraped content as duplication. You should create original content.

8. Do not overuse anchor-text linking, as the Penguin update specifically targets this.

9. Do not use black-hat methods to publish content, as Google considers them an act of manipulating the SERPs. Spun content is considered content theft, so do not buy it.

10. If your website gets hacked, Google will remove it from the SERPs immediately. Ensure your website is hosted on a secure server and that adequate measures are taken to keep it safe.

11. Do not over-optimise your website with keywords; Google sees it as an effort to game its algorithm.

12. If your website goes down, Google will prefer to de-index it rather than send visitors to a dead end.

13. Keyword-rich domain names create an anchor-text issue: if you link to such a domain continuously, Google may see it as anchor-text manipulation, and an exact-match domain can look like an attempt to fool people into clicking.

14. Blog networks, or any other network of websites used for link building, are considered potential SERP manipulators. Remove all links coming from such networks.

15. Comment systems have automated spam detection, but you should still review your comments and not let spam build up through them. Switch comments off if you cannot monitor them.

16. If your website is reported as a source of spam through Google's online spam-reporting form, you may receive a penalty. Such reports can be genuine or malicious.

17. When forums contain so many signature links that the actual posts are hard to find, a penalty may follow. Use natural linking methods and mark such links 'nofollow'.
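For example, a signature link can be marked with the `rel="nofollow"` attribute so it passes no link equity; the URL and anchor text here are placeholders:

```html
<!-- Signature link marked nofollow so search engines do not count it -->
<a href="https://example.com/" rel="nofollow">Example Site</a>
```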

18. The robots.txt file should be used to tell search engines what they may access on your website. Blocking excessively with this file can cause a penalty.
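A minimal robots.txt might look like the sketch below; the paths and sitemap URL are illustrative, and the point is to block only what genuinely should not be crawled:

```txt
# Allow all crawlers, but keep them out of the admin area only
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```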

19. Do not link to suspicious, malware-hosting, hacking or porn sites. Also remove links from sites that have been penalised before.

20. Do not create single-page websites optimised for one main keyword that funnel users to another site, and do not use multiple low-quality, thin landing pages to improve your position in the SERPs. Google considers these bad practices.

21. If you receive a penalty, do not use a 301 redirect to move to a new location or domain. Even if the redirect is later removed, the penalty will still be there.

22. Google has waged war on words that frequently appear on spam websites. Avoid such words if your industry is rife with spam.

23. Google prefers that content references content of a similar standard. If yours does not, it may appear that you are trying to attract traffic unnaturally.

24. Rather than buying a domain with a bad history, it is best to invest in a new one. If you must buy an old domain, check its backlink history using Ahrefs or SEMrush and its website history using the Wayback Machine.

25. Do not buy content from a content 'farm'; these are sites defined by low-quality or shallow content. Employ professional writers instead.

26. Do not use quick fixes that violate the Google Webmaster Guidelines to rank higher in the SERPs, and do not employ anybody who promises such fixes.

Other considerations:

27. Google does not like 404 errors, because users do not get what they are looking for, so make sure your website avoids them. Too many 404s can lead to a penalty.

Google dislikes other error responses too, be it 302, 500 or others, so keep them off your website. You should also not have too many broken external links.
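As a rough illustration, a small script using only Python's standard library can flag links that return error responses or fail to respond at all. The helper names and the "4xx/5xx means broken" rule are assumptions for this sketch, not an official Google tool:

```python
# Sketch of a broken-link checker using only the standard library.
from urllib import request, error

def check_url(url, timeout=10):
    """Return the HTTP status code for url, or None if unreachable."""
    try:
        req = request.Request(url, method="HEAD")
        with request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except error.HTTPError as exc:
        return exc.code   # e.g. 404 or 500
    except OSError:
        return None       # DNS failure, timeout, connection refused

def is_broken(status):
    """Treat 4xx/5xx responses, and no response at all, as broken."""
    return status is None or status >= 400
```

Running `check_url` periodically over your site's internal and external links, and fixing anything `is_broken` flags, is one way to stop 404s and 5xx errors from piling up.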

28. If your site has missing meta descriptions, it will struggle to appear properly in related search results. Do not use duplicate metadata, as it can cause duplicate-content issues, and do not use more than five meta keywords per page.
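A page head with a unique meta description might look like the fragment below; the title and description text are placeholders:

```html
<head>
  <title>Blue Widgets | Example Shop</title>
  <!-- Unique, page-specific description; do not duplicate across pages -->
  <meta name="description" content="Hand-made blue widgets with free UK delivery.">
</head>
```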

29. The header tag (H1) tells Google what a page is about. Using too many H1 tags on a page will confuse Google.

30. All your images should have alt text, as it allows them to rank in image search results. Google cannot read images, and missing or unoptimised alt text can hamper your rankings.
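The two points above translate into markup like this: a single H1 describing the page, sub-headings for sections, and descriptive alt text on every image (the heading and file names are illustrative):

```html
<h1>Blue Widgets Buying Guide</h1>
<h2>Choosing a size</h2>
<img src="blue-widget.jpg" alt="Large blue widget on a white background">
```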

31. Google uses the XML sitemap to understand the structure of your website. Create an up-to-date sitemap and submit it in Search Console for better results.
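A minimal XML sitemap has the following shape; the URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```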