Recently, Matt Cutts, the head of Google’s webspam team, discussed a new Google algorithm change aimed at eliminating webspam. Webspam is content scraped from other websites and republished elsewhere in an attempt to attain higher search engine rankings. We’ve all seen these sites in the search results: we click a promising link only to land on a page full of junk content.
This change is intended to clean up search results, but it could also over-sanitize them, and your website could be punished in the process. If you read a blog post or article online and want to share it with your network, the temptation is to repost the content directly on your own site. While never as beneficial as publishing original content that references the source, a moderate amount of duplicate content has not typically been penalized.
Unfortunately, Google’s new algorithm could mean that your website is flagged as spam. In the past, Google has been criticized for being too lenient on webspam, so the new change will almost certainly heighten the sensitivity of its spam filter. If your website is mistakenly categorized as webspam, your harmless duplicate content may be filtered out of the results, and your entire site runs the risk of being banished from Google as well.
So how can you avoid the frozen tundra of Google Spamland? If you find some riveting content that you just HAVE to share, resist the urge to copy and paste it into a new blog post. Instead, pull out a few key quotes and build your own article around them, or simply link to the original content and reference it that way. Just make sure there is enough new material to signal to Google that you’re not a scraper.