Words = Money
Internet Marketing Articles by Mike Liebner.
Real Content is the Answer by Mike Liebner
A spam banning frenzy is in full effect - DE-EVO-lution rules the day. Now, it's more important than ever to run away from serving up junk and instead focus on building quality, real content web sites!
Oh no! The human reviewer has got your web page in their hot little hands and is going to read your words and look over your web site! Sounds like I am talking about the DMOZ Open Directory or the Yahoo Directory, but I am talking about Google and Yahoo!
The Search Engines are on a RAMPAGE!
It’s no secret, Google, Yahoo and even MSN are banning web sites in record numbers, and not just with the help of filters and algorithms. In their quest to offer only the best high quality search results to their visitors, the search engines are adopting directory-like human reviewing techniques!
Search engines now have employees whose primary job is to surf the web looking for spam, push a button, and ban every web site they come across that they feel should NOT be on the first page.
At the top of their lists for human review are web sites that get complaints (usually Adsense-type sites), reported through web forms on the search engines that allow do-gooder surfers to easily point to sites as spammers. Some people even volunteer to rat on spam sites. And if that’s not enough, there is a good chance your competitors will put your sites through the wringer, and if they uncover any spamminess they will rat you out.
Actually, it's very likely that it is the competition that is fingering the sites to get banned as they may be angry at losing the top position to a competing marketer. The only solution is to play by the rules! Wait - there are no published rules... just a mantra that says do no evil...
But what about the people who make web sites with good intentions? They're not evil. They just don't keep up on all the top talk about this sort of thing, and they make mistakes like repeating keywords excessively.
And SEO aka Search Engine Optimization - I get the feeling that THEY (Google) think this is EVIL. SEO optimizes pages for better rankings. We should not do that - should we??
Well - it's not immoral or unethical so I'll go up against the big G and say YES we should do those things... but we need to be aware that some human reviewer may not agree with us and push the button and banish us! Ouch! That hurts!
It’s just not safe to employ borderline techniques these days. Overly hyped up pages with too many ads or JUST Adsense may appear to be strictly commercial. That in itself is not bad, but they want encyclopedias to rank on top too. Not just commercial pages.
The reason is really pretty obvious if you try to see things from the search engine’s perspective.
What’s happening is that there are so many billions of web pages out there that the search engines are having a harder and harder time of indexing and cataloging all the web pages their crawlers are spidering. Combine the huge growth of the internet with the proliferation of webmaster software that can build thousands of junk pages at the push of a button, and you’ve got a bulging inventory of web pages.
Those machine generated junk pages are clogging the hard drives and servers of the search engines.
The thing is that the people who are the customers of the search engines, have been complaining quite vocally that too many web pages that appear on the first page of the search results are nothing more than web pages with links to other web pages. Junk directories with scraped search engine results. Made for Adsense (MFA) sites that want nothing more than an ad click.
You know what I’m talking about.
In other words, the automated in your face commercial junk is making it unpleasant and often difficult for the search engine customers to find real content web pages.
So… to keep their customers from looking elsewhere, all of the search engines are now aggressively blacklisting garbage web sites and removing them from their indexes.
Webmasters beware, the search engines are cleaning house and the free ride is over. That means that the zillions of machine generated junk pages, which have become an epidemic, can no longer make webmasters money like they used to - if any money at all.
There is a better way though, and it actually has a long term profit stream attached.... real content web sites! Yes, why not build web sites with real content that will last the test of time? Why build junk sites that will bite the dust when you can spend time on things that last??? That's the ticket! Build sites that last!
Web Sites Need Quality Backlinks by Mike Liebner
Did you know that search engines will give greater weight to a web site's backlinks - i.e. text links from other web sites - if and when those links are on domains hosted on diverse IP addresses?
Hmmm... that makes sense you know as in the real world a popular site has links coming in from all over the globe.
The quickest way to tell the search engines you are linking to your own web sites is to link from a site that is located on the same IP address. So, if all your domains are hosted with one host on one server, there is a great chance they all share an IP or a block of IP addresses in the same class C block.
That can spell TROUBLE and trigger filters if you link among your own sites.
OH NO! I want to link to my own sites!
Fortunately, I found that there were web hosts offering dedicated servers for much less than I was paying and I adapted and spread my domains over 3 new servers with clean IP addresses. You see, the search engines, and Google in particular, had become so sophisticated that they could figure out that servers had IP addresses that were all in the same class C block, and if too many links came from the same IP range, it signaled a red flag to go up (it still does by the way) and they’d ban the interlinking sites.
So the solution that worked then and works now is what I like to call IP Diversity. Make sure that if you link your web sites among themselves, you make an effort to spread the domains onto different servers with as many different class C IP addresses as you can.
By class C, I refer to the third octet in an IP address. For example, in 216.92.17.11 the 17 is the class C. So, if you had a domain on a separate IP address like 216.92.17.168, it would still be considered the same class C IP block, and it would very likely trigger a red flag if an unusually high percentage of your overall links came from one particular class C block.
So, if you want to make sure your sites are ban proof and do not get dinged with penalties - or worse yet BANNED - for linking within the same class C IP block, make sure you spread your web site domains across different web hosts all over the globe. You see, it’s just not natural for a web site to have TOO MANY links pointing to it from the same IP class it is in, OR from too many web sites within the same class C IP blocks.
If you really want to study this further I suggest checking out the excellent OptiLink software, or SEOElite, which can show you what IP Addresses the domains that are linking to you have. If you run one of those missions it will become clear to you when you see the IP addresses. So, if you want to make your sites ban proof, be sure to consider IP Diversity whenever linking to your own sites, or when trading links en masse with other webmasters.
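To see how your own backlinks cluster, you can sketch the same kind of check in a few lines of Python. This is my own illustration, not how OptiLink or SEOElite work internally; the domain names you feed it are up to you, and the lookups need a live network connection:

```python
# Group linking domains by shared "class C" (/24) block.
# Illustrative sketch only - real link analysis tools do much more.
import socket
from collections import defaultdict

def class_c_block(ip):
    """Return the first three octets of an IPv4 address (the /24 block)."""
    return ".".join(ip.split(".")[:3])

def group_by_class_c(domains):
    """Resolve each domain and bucket the domains by shared class C block."""
    blocks = defaultdict(list)
    for domain in domains:
        try:
            ip = socket.gethostbyname(domain)
        except socket.gaierror:
            continue  # skip domains that do not resolve
        blocks[class_c_block(ip)].append(domain)
    return blocks
```

Any block whose bucket holds a large share of your linking domains is exactly the kind of pattern the filters are looking for, and a candidate for spreading across different hosts.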
If you can't afford a dedicated server, set up new free hosted accounts at wordpress or blogger, and also get a few cheapo hosting accounts that go as low as $4.95 a month or so. That'll give you some IP diversity and allow you to be a bit more in control of your linking.
Employ Reasonable Keyword Density - Avoid Hidden Text
Have you been banned lately??? No, I didn’t say hugged, I said BANNED as in blacklisted!
Search engines are so committed to cleaning up their indexes that in the frenzy to remove sites, even clean and borderline sites are getting banned. Hopefully the information presented in this article can help you prevent your site from even being considered for deletion.
As an internet marketing veteran of over 10 years, I have experienced a large variety of business-changing phenomena. I have watched search engines come and go, as well as techniques to get free search engine traffic change and evolve. In this series of articles and in my Special Report available at my web site ArticleUnderground.com, I’ll share some of the secrets that can help you stay in the game and prosper for a long time!
Back in the day, all it took was figuring out the exact formula for keyword density and I’d be able to optimize a dozen or so of my pages to dominate the search results. The competition wasn’t so great, so I could easily get top ten rankings for keywords that got lots of traffic. It was a fun game! But as with all good things, the word got around and soon everyone and their mother were building web pages and trying to get that free search engine traffic. Each webmaster was studying what the other webmasters were doing and were trying to one up them and leap frog to the top of the ranks.
Increased competition and the demands of the marketplace meant the landscape had to change.
One of the first things that the search engines figured out was that spammers tended to stuff their web pages with hundreds or even thousands of hidden keywords. Sometimes referred to as keyword stuffing, it was common to find web pages with blocks of hidden text at the bottom, mixed in with the same color of the web page background.
Search engines figured out text and backgrounds shouldn’t be the same color and developed algorithms to catch web pages using this trick.
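The basic version of that check is easy to picture. Here is a hypothetical Python sketch that flags inline-styled text whose color exactly matches a given page background color. Real engines go far beyond this (external CSS, near-matching colors, positioning tricks), so treat this as the idea in miniature:

```python
# Naive hidden-text check: flag elements whose inline text color
# exactly matches the page background color. Illustration only.
import re

def find_hidden_text(html, background_color):
    """Return the text of elements styled with the given color."""
    hidden = []
    # Matches style="...color: <value>..." followed by the element's text.
    # The (?<!-) lookbehind avoids matching "background-color".
    pattern = re.compile(
        r'style="[^"]*(?<!-)color:\s*([^;"]+)[^"]*"[^>]*>([^<]+)<',
        re.IGNORECASE)
    for color, text in pattern.findall(html):
        if color.strip().lower() == background_color.lower():
            hidden.append(text.strip())
    return hidden
```

A page returning a long list from a check like this is exactly the kind of thing the early anti-spam algorithms were built to catch.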
When it became common knowledge among webmasters that this hidden text keyword stuffing technique could work against you, webmasters found clever ways to NOT quite hide the text (colors similar to the background, but not exact) and still stuff the keywords.
These days, the search engines simply don’t rank pages with too many repeated words. It is generally agreed upon by SEOs (search engine optimization experts) that the ideal density for a keyword phrase is between 1% and 3% - a reasonable range that should keep your web page from the scrutiny of a human reviewer.
I’ve provided webmasters a great free tool to measure the density of your web pages, including 2 and 3 word phrases. It’s called the Article Underground Keyword Density Analyzer Tool, or AUKDAT for short! It’s available online at http://www.articleunderground.com/webmaster/keyword-density-tool.html and it’s the only density analyzer I am aware of that actually measures 2 and 3 word phrases!
Since search engines are now employing “human reviewers”, it is also wise to make sure that your content reads naturally and does not appear to be artificially forcing a keyword phrase down the readers’ throats.
If you want to play it safe, make sure your web pages do not contain obvious or blatant repetition of keywords and that the phrases fall into the 1% to 3% range. A little less, or a little over should be ok, but as a rule, shoot for that range.
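If you're curious what such a tool actually measures, here is a minimal Python sketch of my own (not the AUKDAT code - exact formulas vary between tools) that scores a 1, 2, or 3 word phrase against a page's total word count:

```python
# Minimal keyword-density sketch: percentage of a page's words
# accounted for by occurrences of a given phrase. Illustration only.
import re

def keyword_density(text, phrase):
    """Return the phrase's density as a percentage of total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count overlapping occurrences of the phrase in the word stream.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    # Density = words consumed by the phrase relative to total words.
    return 100.0 * hits * n / len(words)
```

By this formula, a two-word phrase used twice in a 100-word page scores 4% - just above the safe range, so you'd trim one occurrence.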
Scraped Content Sites Do Not Last the Test of Time (Build Real Content Sites!)
Real content is definitely the way to go! There is no need to get greedy and mix in risky scraped content. It's just not worth losing it all!
Stop calling me names! I am not a "black hat"! Hey! I'm only human! Cut me some slack! I'm sorry but I could not resist the temptation to add some scraped content pages to my highly successful music web site! I had no idea it would get banned by Google!
Oh well... sometimes you learn some lessons the hard way!
And the lesson is:
NEVER use scraped content on a REAL CONTENT web site
Never ever use "scraped" or “borrowed” (some say stolen) content on a site you do not want banned. It's just not worth taking a chance that a good site will go bad and get banned.
I personally have lost several of my highly popular and successful high PageRank hand made real content web sites because I made the mistake of including a handful of pages with scraped search results. I'm not even talking thousands of pages, just mere hundreds... but they WERE scraped and I paid the price.
It’s not worth risking your legit site's position on Google by including any "unauthorized" content. I regret adding the scraped search engine directory style pages (often referred to as Portal Pages) because the amount of traffic the already popular sites lost was significant.
Trust me, if you have a successful site, don’t ever use scraped content on it.
Google wants to provide relevant results. Can you blame them?
Google re-defined the role of the search engine to an enamored public, who became infatuated with its spam free results (less spam at least). Google also had a tremendous impact on SEOs and internet marketers, who had to adapt their businesses to harness the power of the free traffic that the monster Google could provide. I have to admit for a short period I was sleeping and didn’t spend the necessary time adjusting as I should have, and when my business earnings dropped to an all time low about 3 or 4 years ago I had a massive wake up call.
Link Popularity and PageRank are the new standard
PageRank became the new standard for Google to rank web sites, and it based PR on a formula determined by how popular a web page was. The more external links a page received from other web pages with high PageRank, the more relevant and popular it appeared, and therefore the more important Google considered it. While they appeared to value lots of links, they seemed to favor links from other high PageRank pages. You see, pages could pass along PageRank to other pages. Web sites that had higher PageRank would have an advantage and would in most cases rank higher than similar pages that were not as popular.
PageRank is PAGE SPECIFIC, not SITE specific. Each page has different PR.
While not as important as external links, internal links too result in a site passing PageRank. If the pages have proper linking, the internal pages can even focus power to a small set of pages, almost forcing increased rankings for the text linked on those pages.
As with anything, the webmaster community figured out that lots of links to a web site could boost the rankings and link farms and linking schemes grew in popularity. Also webmasters began to buy and sell links based on PageRank.
In the case I cited above, I added a directory of around 200 machine generated pages to my popular music site for the purpose of trading links. Since the directory menu was linked on every page of my 600 page site, it obtained its own high PageRank. The pages had scraped content on them and I simply added links from partners to them. It worked for about 3 months, and then suddenly the home page went from PageRank 6 to 0; although the site technically stayed in the index, not more than a dozen pages remained indexed.
My daily traffic dropped from 3,000 to less than 200 visitors a day. It was NOT worth tampering with a successful formula and the result was catastrophic, all because I got greedy and added those portal style directory pages with scraped search engine content.
I learned my lesson. Never ever mix in junk content, such as scraped search engine results onto a real content site. It'll likely get that site banned!
Banned sites are "Bad Neighborhoods"
While Google was the pioneer, other search engines like MSN and Yahoo were also employing similar methods of ranking sites based on links. Linking was the new game for webmasters seeking to increase rankings.
Like many others earlier in the game I linked all of my sites to each other to spread my traffic around.
Not long after I noticed that Google had actually banned and blacklisted some of my ever growing stable of web sites.
It appeared that Google banned my sites for no other reason than that they were on the same server and I had been interlinking my own sites together. I wasn’t trying to game them; I just used the power of my own traffic to advertise my own web sites to each other. But obviously Google placed so much importance on “link popularity” as a factor for determining PageRank that they felt it was “un-natural” for my sites to have so many links from the same network, and as such dropped my domains from their index.
My sites were banned and then became “bad neighborhoods”.
Yes, when a site is banned Google will penalize other sites that link to the banned site. So as a general rule, make sure to never link to other sites that are NOT indexed by Google. A filter could trigger a red flag and they’d look at your site because it linked to a “bad neighborhood”.
How severe the penalty is, we do not know for sure, but as there is no upside to linking to a site that has been banned by Google, why take a chance?
As I learned more about the importance of links, I needed to find ways to prevent more sites of mine from getting banned. I had a problem because all my sites were on one big jumbo server and even though they had separate IP addresses (20 of them) they still were all in the same class C IP block.
At that point I was determined to find a way to stay in the game and prevent more sites from getting banned, and as with most obstacles to making money, I found solutions and continued on.
If Link Popularity and high PageRank were the key to higher rankings, I needed to find a way to play the game yet not break their golden rules.
At that point I discovered that the prices of dedicated servers had fallen from the obscene amounts that I had been paying for my one beefy dedicated machine. As a high traffic internet marketer I was spending $3,500 a month just to keep one server that wouldn’t constantly crash because of the massive spikes I’d have in traffic. After all, a #1 ranking for a popular search term like the single word “music” or “money” or “free” could easily bring in 50,000 unique visitors a day. Get a few top tens for huge traffic keywords like those and even the manliest of servers would crash.
But what good was having a robust server when I wouldn’t be able to link my own sites with each other???
As it turned out I was able to buy a couple of servers for far less than I was paying. This enabled me to spread out my domains so they were not all on the same host. The advantage to this is that I could be less worried about linking my sites together and, possibly even more important, I was able to reach out to other webmasters and create custom link exchanges offering them links from sites on several servers. This is a great incentive and helped me get some good quality links, improving the link popularity and PageRank of my web sites. If you have more than a handful of sites, I suggest you consider spreading them on different hosts and perhaps even dedicated servers. That way you'll stay one step ahead of the risk of your domains becoming bad neighborhoods!
© 2011 Words Equal Money