Google is getting smart, and that is a good thing. Google used to chase after meta tags and have websites submit their data. It no longer needs to. Google has bots that crawl the web and eventually discover sites through link exchanges, clicked links, or links to and from sites running Google Analytics. Google will eventually find your website. Whether your website is relevant or not has now become part of the Google algorithm.
Google noticed something a while ago, something that was never meant to be: a search query returned the same content across the entire first page. Each result was a copy of the others, and yet all of these pages were seen as relevant enough to rank on the first page. In reality, at most one of those pages contained any useful information, because all the pages were the same.
Keywords, tags, SEO and many other tricks made it all too easy to get Google to put your page first, but that did not mean the pages pulling these tricks were actually relevant, so Google came up with an algorithm to defeat these websites. There are several algorithms at work behind each Google query, and Google collects millions more bytes of data every day from the queries the world is searching.
Google’s new algorithms began eliminating these websites, deemed duplicate content. Unfortunately, there was some collateral damage, and some websites that did not contain copied content were hit; their owners found them no longer appearing on Google at all. Google says it is trying to improve the algorithm, but those sites must now find a new way to exist and be seen.
Google wants original content, which is very understandable. It is very annoying to search for something and have the first few pages of results all give you the exact same content. If you do not find what you’re looking for within those first few pages, then even if the information does in fact exist, the authentic source is effectively lost. If there were some way to eliminate all the websites that have no relevancy, then what Google is doing would be completely understandable.
This guide is meant for honest webmasters who do have original content but may have messed up somewhere along the way.
A while ago, my site received a penalty for duplicate content. It went from 250 visitors a day to 30. I checked Google Webmaster Tools and realized I had been ignoring all the suggestions on the HTML page. However, that page of “suggestions” was not a list of suggestions at all. They were warnings, and if nothing was done about them, Google would penalize the website.
So here are several steps that can be taken in order to recover your website from any Google penalty.
1. If you do not have Google Webmaster Tools (http://www.google.com/webmasters/), sign up for it. Upload your sitemap.xml and ensure that a Googlebot has crawled through it. If not, give it several days. Use Fetch as Googlebot under Diagnostics.
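If you do not have a sitemap.xml yet, one can be generated with a short script. Below is a minimal sketch using Python's standard library; the URLs are placeholders, not real pages, so substitute your own.

```python
# Sketch: generate a minimal sitemap.xml with Python's standard library.
# The example URLs are placeholders; substitute your own pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap.xml string listing the given page URLs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for page in urls:
        url = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url, "loc")
        loc.text = page
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    pages = ["http://www.example.com/", "http://www.example.com/about.html"]
    print(build_sitemap(pages))
```

Save the output as sitemap.xml in your site root, then submit it through Webmaster Tools as described above.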
2. In Google Webmaster Tools, click on Diagnostics and check your HTML Suggestions. If you have duplicate content, remove it. Address as many of the suggestions as you possibly can!
3. Validate your CSS. Go to a site like http://www.cssportal.com and validate and optimize your CSS! I went from 316 errors down to 17 after I was done; it took me about an hour to do everything manually. There is an automatic optimizer, but be careful and make sure you back up your CSS file first! The optimizer may change things, especially once you start checking for cross-browser compatibility. So I suggest you run the automatic optimization, but make the changes manually.
If you do not already have them, get these tools, depending on your web browser:
- Google Chrome: Inspection, Refresh CSS
- Firefox: Firebug, Refresh CSS
They will save you a lot of time, because the Inspection/Firebug tool lets you preview changes before you actually change anything. And once you do make changes, browsers will often cache the CSS, so it will not always refresh with F5 or the refresh button. The Refresh CSS plugin will show you changes immediately.
4. Write your own content! ONLY WRITE ORIGINAL CONTENT THAT IS YOUR OWN! In college, you were not allowed to plagiarize your papers; it is the same concept with Google. So get rid of anything on your site that might appear on other sites. The only way around this is to provide a link to the original content, and even that may still affect your site.
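If you are not sure whether two of your own pages are too close to each other, a quick text comparison can help you decide what to rewrite. This is a rough sketch using Python's difflib; the sample text and the 0.8 threshold are my own illustration, not a Google rule.

```python
# Sketch: compare two pages' text to spot near-duplicates.
# The 0.8 threshold is an arbitrary illustration, not a Google number.
from difflib import SequenceMatcher

def similarity(a, b):
    """Return a similarity ratio between 0.0 and 1.0 for two strings."""
    return SequenceMatcher(None, a, b).ratio()

page_a = "Welcome to my site about vintage guitars and amplifiers."
page_b = "Welcome to my site about vintage guitars and amps."

if similarity(page_a, page_b) > 0.8:
    print("These pages look like duplicates; rewrite one of them.")
```

For real pages you would strip the HTML first and compare only the visible text, but the idea is the same.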
5. REMOVE DUPLICATE TITLES, META TAGS AND DESCRIPTIONS! Whether your website has several pages or thousands, if they all contain the exact same title, Google will read them as duplicate pages on your site and your site will receive a harsh penalty.
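A check like this can be scripted. The sketch below, a rough Python example where the file names and HTML are made up, collects each page's title tag and reports any title shared by more than one page:

```python
# Sketch: flag pages that share the same <title>. The page sources here
# are hypothetical; in practice you would read your HTML files from disk.
from collections import defaultdict
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text inside the first <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def find_duplicate_titles(pages):
    """pages: {filename: html_source}. Returns {title: [filenames]} for dupes."""
    seen = defaultdict(list)
    for name, html in pages.items():
        parser = TitleParser()
        parser.feed(html)
        seen[parser.title.strip()].append(name)
    return {t: names for t, names in seen.items() if len(names) > 1}
```

The same approach works for meta descriptions: collect them per page and flag any value that appears more than once.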
6. Do not stuff your keywords. There are many tools that can detect keyword stuffing. Use keywords to describe your page and your website, but don’t overdo it! And certainly don’t hide them with CSS or hidden text, because bots will detect this.
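One rough way to audit your own pages is keyword density: how often a keyword appears relative to the total word count. The sketch below is illustrative only; the sample text is made up and any threshold you pick is a rule of thumb, not a published Google number.

```python
# Sketch: a rough keyword-density check. Very high density for one word
# is a common sign of stuffing; any cutoff you choose is a rule of thumb.
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` that are exactly `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A stuffed sentence: "shoes" is 6 of the 10 words, a 60% density.
stuffed = "buy shoes online, best shoes, cheap shoes, shoes shoes shoes"
print(f"{keyword_density(stuffed, 'shoes'):.0%}")
```

Run it against your own page text with your main keywords and rewrite anything that comes out looking like the example above.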
7. If you exchange links, make sure you’ve exchanged with good sites. Remove your site from bad neighborhoods and link farms if applicable. Use this tool to detect which sites are in bad neighborhoods or contain links to them; if they link to bad neighborhoods, it will greatly affect your website’s ranking. (http://www.bad-neigh...t-link-tool.htm)
8. Creating authentic content in subdomains and turning those subdomains into popular visitor areas may help your site.
9. Understand how SEO works, specifically Google SEO. (http://www.seobook.c...eral-damage.php)
10. Stop putting all your eggs into one basket! Utilize all your resources!
- Yahoo Site Explorer, Bing Webmaster Tools, Twitter, Facebook, MySpace, Digg, StumbleUpon, and every social media tool you can think of to help your website!
Google is not the only resource out there. There are many other places that can help you recover your site and get more visitors, even in the eyes of Google, since your site is then being seen through the eyes of popular websites.
These are my own suggestions and things I’ve come across in my research. I hope they help everyone avoid what I went through.