Last week we talked about how the new Penguin 2.0 update has set out to target web spam. However, we didn't go into much detail about what exactly Google is trying to eradicate from its search results.
The so-called black hat SEO tactics you'll hear Matt Cutts talk about essentially mean any deliberate manipulation of a search engine's algorithm. Most webmasters live and die by their Google ranking, which has seen a host of folks aggressively pursue techniques like keyword stuffing, link building schemes, and using automation software for comments. I'll go into each of these, plus a few extras, in more detail below.
As I mentioned in last week's post, Google wants to improve the user experience when people search the web, and its latest algorithm update has so far punished porn sites, game sites, and big brands like Dish.com & The Salvation Army.
So why does Google want to get rid of spam? As you'll see from the tactics below, it's just plain deceitful to web users and delivers no value to anyone except the person responsible for the content.
Let’s take a deeper look at some of the SEO techniques Google frowns upon.
Black Hat Search Engine Optimization
Keyword stuffing

This is basically filling your content with keywords in the hope of convincing search engines of the website's relevancy. You'll easily spot this tactic when you find yourself on a page where you can't make any sense of what's being said.
Hidden or invisible text
Using hidden or invisible text is similar to keyword stuffing, except the reader cannot see it. For instance, a spammer might put white text on a white background so it becomes invisible to the reader, but not to search engines.
However, not all hidden text is considered bad; it's legitimate when used to describe content the user cannot see, such as an image that fails to load on a slow connection. One common example is the alt text on images, which is hidden from the user but describes what the image contains.
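To make the contrast concrete, here's a toy Python sketch (not any real crawler's logic; the style strings are invented for illustration) that flags the classic white-text-on-a-white-background trick by comparing the declared colors in an inline style:

```python
def looks_hidden(style: str) -> bool:
    """Parse a simple inline CSS style string and report whether the
    declared text color equals the declared background color."""
    props = {}
    for decl in style.split(";"):
        if ":" in decl:
            name, value = decl.split(":", 1)
            props[name.strip().lower()] = value.strip().lower()
    color = props.get("color")
    background = props.get("background-color")
    return color is not None and color == background

# The spammer's paragraph: invisible to readers, visible to a crawler.
print(looks_hidden("color: #ffffff; background-color: #ffffff"))  # True
# A normal paragraph with contrasting colors.
print(looks_hidden("color: #000000; background-color: #ffffff"))  # False
```

Real search engines use far more sophisticated checks (computed styles, off-screen positioning, tiny font sizes), but the principle is the same: text the reader can't perceive.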
Doorway pages

A doorway page is easy to spot. Typically it takes the form of a standard template and focuses on specific keywords or phrases. The goal of this type of page is to get people to click through to the target destination.
Scraper sites

It's no secret that Google loves content, and it's one of the main factors driving up your search rankings. Yet some people have forgotten that the content Google adores so much has to be unique.
Scraper websites simply republish other people's work in the hope of ranking highly. In reality they don't, since they deliver no additional value to readers.
Over at their Webmaster Tools page, Google suggests that you:
“Take the time to create original content that sets your site apart. This will keep your visitors coming back and will provide more useful results for users searching on Google.”
Link schemes

Remember when your SEO strategy was all about getting links? Well, some people took it to the extreme and started using automation software to build links back to their sites.
The reasoning behind this is that in the very beginning, Google calculated a website's credibility and popularity based on how many backlinks it had. Nowadays, Google uses other criteria to judge a website's relevancy and looks for natural linking patterns back to the website.
Sneaky redirects

A legitimate use of URL redirection is when you've set up camp on another domain and want to send visitors to the new website.
And then there's the other side of the coin: when URL redirects are used to trick search engines and online users with different content. In some cases the search engine will only index the original page and never follows through to the redirected site.
For instance, I may be searching for new football boots and find a result where someone is selling football boots. I would then click on the search result, but because of the redirect, I end up on a gambling website.
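A sneaky redirect like the one above can be sketched in a few lines of server logic. This is a hypothetical illustration (the bot signatures, page content, and destination URL are all made up), not a recipe: the crawler is served the indexed page while a human visitor is bounced elsewhere.

```python
BOT_SIGNATURES = ("googlebot", "bingbot")  # illustrative, not exhaustive

def handle_request(user_agent: str) -> tuple:
    """Return an (HTTP status, body-or-location) pair for the request."""
    if any(sig in user_agent.lower() for sig in BOT_SIGNATURES):
        # The crawler indexes a page that really is about football boots...
        return (200, "<h1>Football boots on sale</h1>")
    # ...while a human visitor is silently redirected somewhere unrelated.
    return (302, "https://example-gambling-site.invalid/")
```

This is exactly the behavior Google's guidelines prohibit: the content the searcher was promised is never the content they receive.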
Cloaking

Here we have a technique whereby search engine bots are shown content that's different from what readers see. It typically works by using IP addresses to distinguish bots from human users.
The reason cloaking is against Google’s quality guidelines is that it delivers a horrible experience to online users. In a YouTube video, Matt Cutts gives us an example of how cloaking can be used:
“When Googlebot came, the web server that was cloaking might return a page all about cartoons. But when a user came and visited the page, the web server might return something like porn. And so if you do a search for Disney cartoons on Google, you’d get a page that looked like it would be about cartoons, you’d click on it and then you’d get porn.”
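The scenario Cutts describes can be sketched as a simple IP check on the server. This is a minimal illustration, assuming the spammer keeps a list of networks they believe crawlers come from (the range below is only for illustration; the page strings are invented):

```python
import ipaddress

# Networks the spammer assumes belong to search engine crawlers
# (illustrative range only -- real cloakers maintain much larger lists).
CRAWLER_NETWORKS = [ipaddress.ip_network("66.249.64.0/19")]

def choose_content(client_ip: str) -> str:
    """Serve one page to suspected crawlers, another to everyone else."""
    ip = ipaddress.ip_address(client_ip)
    if any(ip in net for net in CRAWLER_NETWORKS):
        return "wholesome cartoon page"   # what the bot indexes
    return "something else entirely"      # what human visitors actually see
```

Because the indexed page and the served page never match, the search result is a lie, which is why Google treats cloaking as a straightforward guidelines violation.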
You're probably wondering why these types of websites exist. The main reason is usually to increase advertising revenue by driving more people to the site. Another is to lure folks into a scam or to make money through other means, such as gambling.
As you've probably noticed from the descriptions above, these black hat SEO techniques deliver a horrible experience to online users and are just plain deceitful.
To make sure you conform to Google's quality guidelines, Matt Cutts suggests you ask yourself this very simple question: "Are you doing anything that treats Googlebot differently from a human user?"
Photo by John Ellis