  Show Posts
57  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / Banned from Google and Wondering Why? on: August 29, 2007, 03:04:03 PM
Banned from Google and Wondering Why?


Some Web site owners get on the computer one night and find that all of their Web pages have disappeared from Google, while others are still in the search engine index but no longer rank for anything, not even their own Web site's name. Getting kicked out of the search engines is a Web site owner's worst nightmare.

Often, webmasters get little or no warning that this is going to happen. Many are left with no idea why they were kicked out, and are left wondering how to get back into Google's index. There could be any number of reasons why a Web site is banned by Google; the most common ones are listed below.

1. Duplicate Content
This is when multiple Web pages have the same content. Usually Google will just penalize the individual Web page, which then won't rank very well for its keywords, but there have been cases where complete Web sites were banned because they had too much duplicate content. You should make sure no other Web site is using your content.

To check for duplicated content, simply search for a unique phrase from your Web page. If you find a Web site that has stolen your content, contact the site owner and tell them to take it down or face legal action. For copyright violations, you can also visit www.google.com/dmca.html and notify Google that someone is infringing on your site's copyright.
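
As a convenience, a few lines of code can build those exact-phrase queries for you. A minimal sketch: the phrase is a placeholder for a distinctive sentence from your own page, and the only assumptions are the engines' standard search URL formats.

# Build quoted-phrase search URLs for a manual duplicate-content check.
from urllib.parse import quote_plus

phrase = "a distinctive sentence copied from one of your pages"  # placeholder

# Wrapping the phrase in quotes asks the engine for an exact-phrase match.
query = quote_plus('"%s"' % phrase)

for engine in ("http://www.google.com/search?q=",
               "http://search.yahoo.com/search?p="):
    print(engine + query)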

2. Cloaking
Cloaking means creating Web pages just for search engines: the server delivers one version of a page to an Internet user and a different version to a search engine. Cloaked pages are built to rank well for particular keywords. There are various ways to deliver them; each search engine's spider identifies itself with a user agent name, and the cloaked page is then delivered only to requests carrying that spider's user agent.
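
To make the mechanism concrete, here is a minimal sketch of user-agent-based delivery. It is shown only to illustrate what the technique (and what the engines try to detect) looks like; the spider names and page strings are illustrative, not a working recipe.

# Illustration of user-agent cloaking -- the very pattern that gets sites banned.
SPIDER_AGENTS = ("Googlebot", "Slurp", "msnbot")  # common spider names, circa 2007

def page_for(user_agent):
    # A cloaking script keys off the agent string the visitor's client sends.
    if any(bot in user_agent for bot in SPIDER_AGENTS):
        return "keyword-stuffed page served only to spiders"
    return "normal page served to human visitors"

print(page_for("Googlebot/2.1 (+http://www.google.com/bot.html)"))
print(page_for("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"))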

You can also deliver cloaked pages to the search engines by IP address, though Google and other search engines say they can detect cloaking. There are other reasons to use this kind of delivery, such as custom language delivery and geotargeted advertising.

3. Hidden text or hidden links
This is text or a link that is invisible to the naked eye on a Web page but can still be seen by spiders. Search engines used to have a hard time spotting this technique, but nowadays you should avoid it because Google and other search engines can spot it easily. Even if a search engine doesn't spot your hidden link, a competitor might find it and report your site. Hidden text can even appear by accident, so double-check any Web page you have edited in the past few weeks.
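
If you want to automate that double-check, a crude script can flag the most common hiding patterns. This is a rough sketch, not a real audit tool: it only greps for a few inline-CSS patterns and would miss styles defined in external stylesheets.

# Flag common hidden-text/hidden-link patterns in a page's HTML.
import re

PATTERNS = {
    "display:none":    re.compile(r"display\s*:\s*none", re.I),
    "zero font size":  re.compile(r"font-size\s*:\s*0", re.I),
    "off-screen text": re.compile(r"text-indent\s*:\s*-\d{3,}", re.I),
}

def flag_hidden(html):
    return [name for name, rx in PATTERNS.items() if rx.search(html)]

sample = '<div style="display:none"><a href="/">link</a></div>'
print(flag_hidden(sample))  # ['display:none']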

4. Keyword stuffing
Keyword stuffing is loading up a Web page's Meta tags or content with keywords. The usual techniques today are repeating the same word(s) over and over again in the Meta tags or in the page's content, or using invisible text, as discussed above in this article. If a word is repeated too much it will raise a red flag to the search engines, and they will likely place a Spam filter on the site.

5. Linking to bad neighborhoods
Bad neighborhoods are link networks designed to artificially increase ranking, or Web sites that use Spam techniques to increase their search engine ranking. You should not link to any Web page that uses Spam techniques to increase ranking, and you should not join link exchanges that exist only to improve ranking or Page Rank. If you are not sure whether you link to any site like this, check each outbound link on your Web site.

6. Buying links for search engine ranking
This is where a Web site owner buys links purely to increase his or her ranking or Page Rank. Google and other search engines still have a hard time detecting this, but they are starting to catch on to the technique. If Google becomes aware of such a site, it can simply discount the site's Page Rank so that it cannot pass Page Rank on.

7. Machine-generated Web sites
This is a site that generates hundreds of Web pages that are basically the same page repeated hundreds or thousands of times, each with a few unique lines of text and a unique title. Search engines often can't spot this if the site owner does it well. However, even if a spider doesn't spot your machine-generated pages, a competitor might find them and report your Web site.

What to do once you are Spam clean
Once you have cleaned up your Web site, you can try contacting Google by visiting http://www.google.com/contact/. Tell them that you made a mistake and won't do it again. Even if you do contact Google, they most likely won't let your Web site back in, and if you do happen to get back in, keep your Web site squeaky clean, because you are unlikely to get another chance.

If you can't get in touch with Google, I suggest waiting a few months after Google's spider next visits your Web site to see if you get your ranking back, or at least see your ranking climbing in the search results. During this time you should not change your Web site around much; give the search engines time to spider it.

I really don't think that many Web sites have dropped because Google is penalizing them. Instead, I think Google has changed the factors, or added more weight to certain factors, that it uses to rank Web sites in the search results. All search engines make periodic changes to the way they rank Web sites, so don't be surprised if one week you rank number one and the next week you rank 30th.

Matt Colyer began as an SEO Specialist in 1997. He founded Superior Webmaster in 2004 as a source of articles and tutorials for Web site owners looking to improve their Web sites.
58  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / 7 Essential SEO techniques on: August 29, 2007, 03:01:42 PM
7 Essential SEO techniques


1) Title Tag - When we're talking about SEO techniques, the Title tag is one of the best and most powerful tags you can use. Every page should have its own title tag, and each tag should include the keyword you are targeting, along with sub-keywords associated with the main keyword you seek to drive traffic from.

2) ALT Tags - These tags are most often ignored completely. Who would have thought that tags meant for text browsers, which cannot display images, would become part of a technique to raise the keyword density of a Web page? Your main keywords need to be included in some ALT tags, but be careful: overdoing this could get you banned and your PR could drop.

3) Link Popularity - Link popularity is (arguably) the most powerful SEO tool. Typically a Web site must have a link pointing to it before the spiders will look at it, so you should have at least 1-2 links going into your site from other Web sites that the spiders already visit. Increasing Page Rank depends on the number and quality of the sites that link to yours. Keywords should appear in the links, along with sub-keywords. Keep the links short; this creates an easily readable keyword term for the search engines to recognize.

4) Keyword Density - Research first, then decide on the keywords you want to target. Don't overdo it: use each keyword only once in the title tag, once in the heading tag, and once in bold text, then aim for an overall density between 5% and 20%. Keywords should also be strategically placed throughout your site, but note that several search engines will recognize bottom-of-the-page or sidebar text/links and discount them, because many Web site designers load those sections with clutter.
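
Density is easy to measure for yourself. A minimal sketch with a made-up line of copy; the word-splitting regex is a simplification, and the thresholds are the ones quoted above.

# Count what share of a page's words are the target keyword.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / len(words)

copy = "Satellite radio reviews: compare satellite radio plans and radio gear."
print("%.1f%% density" % keyword_density(copy, "radio"))  # 30.0% -- stuffing territory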

5) Page Size - Try your best to keep your Web page over 5k and under 15k in size. I just designed one that is close to 53k, mostly because of pictures and a Flash file; so much for the rule of thumb. Look at Yahoo! and Google: they are some of the fastest-loading, and most popular, pages around. People don't like to wait, even if you have "cool" graphics and pictures. Your Web page's speed also matters to robots, which can spider a small page faster and more easily.
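
Checking a page against that size guideline takes only a few lines. A small sketch, assuming the page is reachable over plain HTTP; swap in your own URL.

# Report a page's size in KB against the 15k rule of thumb.
from urllib.request import urlopen

url = "http://www.example.com/"  # placeholder
kb = len(urlopen(url, timeout=10).read()) / 1024.0
print("%s: %.1f KB %s" % (url, kb, "(over 15k)" if kb > 15 else "(within guideline)"))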

6) Theme-Rich Content - Search engines are looking at themes rather than just words or pictures. Content is king; I know you hear it a lot, but it's true! Web pages should have content related to the targeted market, and you should link your visitors to other valuable, related content within your own site. The more pages the better. A good tool I have found to show you linking info is http://www.UpTimeBot.com - you can preload up to 20 pages and then refresh them, and it shows all the vital information, along with how many pages are indexed by each search engine.

7) Cross-Linking Within Your Own Site - Just as you rely on other pages linking to yours in order to get indexed, the same is true for all the pages within your site. Any new content should be easily reachable within 1-3 clicks from existing pages. Linking back to your main home page or a storefront will help you get indexed faster and achieve a higher Page Rank.

Scott Fish
President, http://www.TopSatelliteRadio.com
Personal Blog: http://www.ScottFish.BlogSpot.com
59  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / Most Overlooked Search Engine Optimization Technique on: August 29, 2007, 03:00:14 PM
Most Overlooked Search Engine Optimization Technique


Some of the basics of SEO and getting a high ranking in search engines start with choosing a keyword and placing it strategically throughout the Web site: in the content, title, headline, etc. Using the keyword multiple times in your content is great, but let's look at something that a lot of Web sites completely overlook.

Simply put...

Take your keywords and current links within your site and USE them!

What I mean is this: you have certain keywords spread across your Web site, and you also have links going to other Web sites, affiliates, advertising, resources, etc. Combine those links with *relevant* keywords and you will see an increase in page rank, and in traffic due to the increased PR. Just take a few minutes to scan through your site and find links whose text is merely the name of a company, or anything that says "partner" or "affiliate", and change them: make them say something related to your main keywords.

Follow this strategy; it takes just a few minutes, and this plus other simple strategies will add up to a good increase when your keywords are searched for in search engines.


Scott Fish
President, http://www.TopSatelliteRadio.com
Personal Blog: http://www.ScottFish.BlogSpot.com
60  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / Playing By Google's Rules on: August 29, 2007, 02:58:55 PM
Playing By Google's Rules


As the undisputed leader in search engines, Google places a very high importance on the quality and relevancy of its search results, especially now that the company is public. They know that in order to keep the shareholders and users of the engine happy, the quality of returned results is extremely important. For this reason, doing the wrong thing, purposely or unintentionally, could result in a severe penalty or even get you banned from the listings. Below is a short list of ideas to consider when drafting your search engine optimization campaign.

Hidden Links

Link PR is a hot topic among SEO firms, and whether or not incoming/outgoing links still play as important a role as they used to, hiding links is still considered a "blackhat" technique that can, and most likely will, result in a ban or penalty from Google.

Hidden Text

Stuffing your pages with text too small to read, the same color as the background, or pushed off screen with CSS, for the sole purpose of loading your pages with keyword-rich copy, will get you the same penalties as hiding links.

Page Cloaking

This is the practice of using browser or bot sniffers to serve the bots a different page than your human visitors see. Loading a page specifically for a bot that a human user may never see will almost certainly get you banned from the listings.

Multiple Submissions

Submitting your domain and its pages multiple times is also something to stay away from. For example, if I submit http://www.seohype.com and http://www.seohype.com/resources.html as two separate URLs, I may be looking at a ban, a penalty, or at the least a very long wait before my pages get indexed. This is another reason to avoid auto-submitters. Check whether your domain is already listed in the search engine you are submitting to; if it is, move on to the next.

Link Farms

Be careful who, and even what, you are linking to. Links into your site will not hurt you; even Google knows you cannot control your inbound links. However, you can certainly control what you link to. Link farming has always been a bad apple in Google's eye and should be avoided at all costs. Google also suggests that your own link pages should not contain more than 100 links; assume anything higher than 100 links on a single page could get you classed as a link farm, and avoid it.

Selling Your Site's PageRank

Time and time again I come across sites selling their PR7 links, or only trading links with sites of a certain PR. This will cause a ban or penalty as well. It's okay to sell advertising, or to gain the link, but doing so based on direct advertisement of your PageRank is a sure way to get the bad end of the stick from Google.

Doorways

This is similar to cloaking pages: the practice of having one page loaded with your chosen keywords that simply redirects to another, more "user friendly" page. It is also a big issue for Google. My clients get offers all the time from other SEO firms offering these kinds of "services". If you get these offers, avoid them at ALL costs.

Same Content on Multiple Domains

Google looks at domain IPs, the dates they were registered, etc. Having multiple domains serving the exact same content is a no-no as well. This also applies to serving the same content multiple times on separate pages or sub-domains, and to forwarding multiple domains to the same content.

Conclusion

Many of the above techniques apply to most search engines. By keeping the mindset that you are building your pages for your human users and not for bots, you can ensure you will get the most important things from your site: qualified links, clicks and a higher ROI. For more information, also see Google's information for webmasters at http://www.google.com/webmasters/.

This article was written by Wil Rushmer, founder of http://www.seohype.com - A resource for internet marketing and website promotion tutorials, including search engine optimisation techniques.
61  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / Fresh Content Improves Search Engine Optimization on: August 29, 2007, 02:57:30 PM
Fresh Content Improves Search Engine Optimization


Many search engine optimization companies will sell you a search engine optimization package that addresses many of the major aspects of the field. These aspects include, but are not limited to, use of file names, alt tags, h1 tags, keyphrase density, meta tag optimization, link analysis and the like. These are all key aspects of good search engine optimization.

However, one problem is that the major search engines (especially Google) rank pages not only on relevant content (which is determined by the factors listed above, and more), but on fresh content as well. What this means for you is that even after your site has been "optimized to the max", your rankings will increase to a certain level and then not go much higher. To get to the top and stay there, your site should deliver fresh, relevant content on a regular basis. Depending upon the nature of your business, your competition, and your targeted keyphrases, the rate at which you should add content to your site can vary from monthly to daily.

The delivery of fresh content to your site, in a form that is readable by search engines (i.e. not through the use of javascript, iframes, or the like) requires a dynamic, database driven content management system.

The most cost effective way to achieve this is through the use of a weblog that sits on your server and resides under your domain name. Updating the weblog with rich articles or commentary, broadcasting this information to the internet, and allowing users to post comments, achieves the following:

1) Increases the number of inbound links to your website

2) Increases the frequency at which major search engines will spider or crawl your site

3) Increases interactivity for the web user

4) Improves your search engine ranking

For further information, you may contact ArteWorks toll free at 877-336-8266, or visit http://www.arteworks.biz.

Matt Foster, CEO, Arteworks Business Class - Search Engine Optimization Expert since 1995

http://www.arteworks.biz

62  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / STOP Writing for Search Engines on: August 29, 2007, 02:55:59 PM
STOP Writing for Search Engines


Back when I was starting out with my first internet venture, I did a crazy thing. I subscribed to a Search Engine Optimization newsletter. These guys send a weekly email with their bundle of latest tips. For the first few months, I actually followed what they said. Now, I just keep my subscription to get a few laughs.

You see, their basic advice hasn't changed a bit. It is still about keyword laden content, back links to your site, et cetera, et cetera. Of course, every time the search engines change their algorithm a bit, there is a huge new update on how best to beat this algorithm and remain on top. And you know what? Some of the things these guys say actually work! But here I am, telling you to stop writing for the search engines. I must be really crazy, right?

Not quite. Just like you and probably everybody else, I used to believe that getting the top position on a major search engine would make me rich. It would open floodgates of money, floodgates that I would never be able to close. And to be completely truthful, that is exactly what it did. Well, actually they were floodgates of bandwidth bills, but they were floodgates alright!

That is when I realized that I was making the biggest mistake a salesman could make in his lifetime. Instead of creating a website to sell to and satisfy my customers, I had created a traffic portal for the search engines. Unfortunately for me, search engines don't buy the things I want to sell. Rather, they want to sell their own advertisements around my content. After all, what content do Google and Yahoo produce? Zilch! They live off our content, and yet, whatever we do, we keep the search engines at the top of our minds.

So I told myself: to hell with search engines. I will create a website that is for my customers. Sure, I will receive fewer visitors, but I was certain that the guy willing to sift through the garbage a normal search engine throws up in order to come to my site will really be looking for something specific. And you know what? I will trade such a surfer for a thousand of the surfers that major websites receive on a daily basis.

And this is exactly what I did. I had two different websites competing for the same keyword. The first website was made just for the search engines (MSN in particular) and was ranked 1st on MSN, 5th on Yahoo and 3rd on Google for that particular keyword. On an average day, it received anywhere between 5 and 6 thousand visitors. I made 3 to 5 sales daily from that website.

The second website, on the other hand, made extensive use of images (I even used an image file for my headline because I didn't like the way various browsers treated text). The day Google indexed the website (Google is always the first to index my sites, god knows why!) I was placed 44th for the term. I received a lowly 40 visitors for my effort and made no sales. For a week it continued like this, and then Google dropped the website for some reason or other.

The website was picked up again a week later; this time, Google put it in 38th position. I received 55 visitors a day for nearly a week and made two sales in the entire week. Now it was time to push the website a bit. I wrote a nice little press release and a couple of articles related to the product I was selling, and sent them to various article databases and press release sites. A month later, Google had pushed me up to 18th for the search term. I currently rank 15th for that term and receive around 1,500 visitors from various search engines for an average of 6 sales a day. My articles still bring me visitors, around 150 to 200 a day, for another couple of sales on average.

Since that little experiment succeeded, I have stopped writing my sales copy to suit the search engines. Rather, I write my sales copy to sell. Even if only one person comes to my website, I want him interested and ready to take out his credit card to buy from me. However, visitors are important, so I create content for search engines through articles about the product, press releases and blogs. The whole idea of each of these is to bring the visitor to my sales site. In addition, once the PR of my site has reached a reasonable number, I start exchanging links with other similar websites. And frankly, that is the kind of search engine optimization you should be working on.

Since all articles must end with advice, here is mine. Separate your sales page from the pages you create for search engines. Use large banners and innovative techniques to get your visitors from those pages to your sales page. Never, ever mix search engine optimization with your sales copy. Sure, for some people it may reap rewards, but for most of us a page optimized for search engines almost never sells. On the other hand, a page made for the customers actually ends up getting a decent position in the search engines!

Pankaj Saini has just started a new internet publication called Learn and Do It to help fellow internet marketers make a better living online. For more of his articles, please visit http://www.learnanddoit.com/
63  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / Do Search Engines Like Your Web Site? on: August 29, 2007, 02:55:06 PM
Do Search Engines Like Your Web Site?


Between 75% and 98.8% of visitors to Web sites come from searches made at search engines. If you're going to get high levels of traffic - and hence the levels of ROI you're looking for - it's very important that the search engines can access all the information on your Web site.

Do the search engines know about all of your pages?

You can find out which pages on your site the search engines know about by using a special search. If you search for 'site:' and your Web site address, the search engine will tell you all of the pages on your Web site it knows about.

For example, search for site:webpositioningcentre.co.uk in Google, Yahoo or MSN Search, and it will tell you how many pages they know about.

If the search engines haven't found some of the pages on your Web site, it is probably because they are having trouble spidering them. ('Spidering' is when the search engine uses an automated robot to read your Web pages.)

Spiders work by starting off on a page which has been linked to by another Web site, or that has been submitted to the search engine. They then read and follow any links they find on the page, gradually working their way through your whole Web site.

At least, that's the theory.

The problem is, it's easy to confuse the spiders - especially as they are designed to be wary of following certain kinds of link.

Links which confuse spiders

If your links are within a large chunk of JavaScript code, the spider may not be able to find them, and will not be able to follow the links to your other pages.

This can happen if you have 'rollovers' as your navigation - for instance, pictures that change colour or appearance when you hover your mouse pointer over them. The JavaScript code that makes this happen can be convoluted enough for the spiders to ignore it rather than try to find links inside.

If you think your rollovers are blocking your site from being spidered, you will need to talk to your Web designers about changing the code into a 'clean link' - a standard HTML link, with no extra code around it - that is much easier for the spiders to follow.

Links like these will look something like this:

<a href="index.html">Home Page</a>

Page addresses to avoid

Spiders will also ignore pages if they don't like the URL (the address needed to find the page).

For example, a Web site that has URLs containing several variables can cause spiders to ignore the page content. You can spot pages like these as they have a ? in them, and &, for instance:

http://webpositioningcentre.co.uk/index.php?page=12&cat=23&jib=c

This URL has three variables, the parts with the = in them, between the ? and &s. We find that if a page has one variable, or even two, the top search engines will spider them without any problems. But if a URL has more than that, often the search engines will not spider them.

Spiders particularly avoid URLs that look like they have 'session IDs' in them. They look something like this:

http://webpositioningcentre.co.uk/index.php?page=12&id=29c8d7r2398jk27897a8

The set of numbers and letters does not make much sense to humans, but some Web sites use it to keep track of who you are as you click through their Web site.

Spiders will generally avoid URLs with session IDs in them, so if your Web site uses them, you need to talk to the people who developed the site about rewriting it so that it does not use these IDs, or at least so that visitors can get around the Web site without them.
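
Both checks are easy to automate across a list of URLs. A rough sketch: the more-than-two-variables cutoff comes straight from the paragraphs above, and the session-ID pattern is only a guess at what a long mixed run of letters and digits looks like.

# Heuristic check: is a URL likely to be spidered?
import re
from urllib.parse import urlparse, parse_qsl

SESSIONISH = re.compile(r"^(?=.*[a-z])(?=.*\d)[a-z0-9]{16,}$", re.I)

def spider_friendly(url):
    params = parse_qsl(urlparse(url).query)
    if len(params) > 2:                           # more than two variables
        return False
    return not any(SESSIONISH.match(v) for _, v in params)  # no session IDs

print(spider_friendly("http://webpositioningcentre.co.uk/index.php?page=12"))  # True
print(spider_friendly(
    "http://webpositioningcentre.co.uk/index.php?page=12&id=29c8d7r2398jk27897a8"))  # False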

Clean links = happy spiders

If you use clean, easy to follow links without several variables in them, your Web site should be spidered without problem. There are, of course, many other facets to successful Search Engine Optimization, but if the search engines can't spider your content, your site will fall at the first hurdle.

Paul Silver and David Rosam are Head of Technical SEO and Head of SEO Copywriting at Web Positioning Centre (http://webpositioningcentre.co.uk). Paul has been involved with the Web commercially since 1996 and David has been writing marketing copy for 20 years, and writing for the Web for a decade.
64  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / LSI and Link Popularity on: August 29, 2007, 02:54:32 PM
LSI and Link Popularity


When Paypal's official Web site no longer ranked #1 in Google on a search for "paypal," it was obvious that Google had become more aggressive in penalizing sites with "unnatural" backlink anchor text. Although the high-profile Paypal example has since been rectified, thousands of webmasters are suffering the consequences of not ranking for even their official company name, let alone their top keywords. It is important for search engine optimizers to understand both how anchor text penalties are being applied and how LSI ensures that anchor text variance will not dilute a link popularity building campaign.

Anchor Text Penalties

In the past year, webmasters have found that the aggressive link popularity building tactics that work well in search engines such as Yahoo! do not fare well in Google. Google has implemented several features to filter out sites that appear to have an unnatural backlink structure; one of these features seems to be specifically penalizing sites with unnatural backlink anchor text.

It has always been an SEO best practice to use descriptive anchor text in both external and internal links. But search engine optimizers have often focused on a single keyword phrase when choosing anchor text, especially if their topic has one keyword that receives vastly more traffic than any secondary keywords. Since good links are hard to come by, they do not want to "waste" any of those backlinks with anchor text that does not contain their main keyword.

The drawback to this approach is that it can be interpreted as unnatural by a search engine. A site with organic, passively-obtained backlinks will have a wide variety of backlink anchor text variations such as: "official site title," "keyword," "keyword synonym," "www.thesite.com" and even "click here." If the vast majority of a site's backlink anchor text is simply "keyword," it is obvious to an algorithm that the link popularity was not obtained organically.

Latent Semantic Indexing (LSI) Basics

Let's now touch upon the myth I mentioned before, that if a backlink's anchor text does not contain your Web site's main keyword, its power is wasted. The concept of latent semantic indexing, which may be more fully implemented by major search engines in the near future, will prove this myth to be false.

Latent semantic indexing can help overcome the "vocabulary mismatch" problem when a human uses a search engine. Individual words do not always provide reliable evidence about the conceptual meaning of a document. For instance, a Web page that is highly relevant to the term "laptop" may never use the term "notebook," however it is clear to a human being that "notebook" is often used as a synonym for "laptop."

While it is beyond the scope of this article to discuss the mathematics behind LSI, its implications for search algorithms are simple. LSI can use statistical techniques to create a semantic analysis for any given query topic. In practice, this means that a page can be considered relevant for a particular keyword, even if it does not contain that keyword. For instance, a page that is considered relevant for "laptop" can also be considered relevant for "notebook" even if it does not contain the word "notebook," if LSI determines that "notebook" is semantically related to "laptop."

The principle can be applied to backlinks as well. Backlinks with anchor text that do not contain your Web site's main keyword, but instead contain a synonym or related word, may still be giving your site a bonus for the main keyword.

Link Popularity Building Best Practice: Vary Your Anchor Text

The recent increase in penalties given to sites with unnatural backlink anchor text, along with the possible implementation of LSI, should give webmasters motivation to vary their backlink anchor text heavily. Rather than seeking to only obtain links using their main keyword, webmasters should include synonyms, variations and related words. Certainly no single keyword variation should be used the majority of the time; rather, the text of all links should vary widely, just as they would if the links were obtained passively. This will ensure a site's improvement in the SERPs, without drawing a penalty flag.
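
One practical way to follow this advice is to draw each new link's anchor text from a weighted pool that mirrors the organic mix described above. A sketch with invented phrases and weights:

# Sample anchor text so no single phrase dominates the backlink profile.
import random

ANCHOR_POOL = {
    "Acme Widgets": 30,          # official site title
    "widgets": 20,               # main keyword
    "widget store": 15,          # related variation
    "www.acmewidgets.com": 20,   # bare URL
    "click here": 15,            # generic
}

def pick_anchor():
    texts, weights = zip(*ANCHOR_POOL.items())
    return random.choices(texts, weights=weights, k=1)[0]

print([pick_anchor() for _ in range(5)])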

Andy Hagans is a search engine optimization consultant who specializes in link popularity building and risk management. Visit http://www.andyhagans.com for more information. See http://www.andyhagans.com/link-building.php for more information on Mr. Hagans' link building services.
65  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / The Unethical SEO Myth on: August 29, 2007, 02:53:55 PM
The Unethical SEO Myth


"The use of black hat SEO techniques are completely unethical." Really? I completely disagree.

Is it unethical because these techniques are attempts to give a webmaster an advantage over their competition? If this is the case, stop participating in SEO forums, stop using WordTracker, stop studying or engaging in SEO at all, because any of those things are ways to help get you a leg up on your competition. In fact, if it is your argument that this is unethical because it gives one person an advantage over another, stop building websites because there are plenty of businesses out there that don't have a website. You may be taking advantage of a medium some people don't have access to.

Is it unethical because it violates Google's Webmaster Guidelines (http://www.google.com/webmasters/guidelines.html)? If this is the case, again, abstain from any SEO whatsoever, because those guidelines state 'Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"' If you are building sites for your users and disregarding search engines, things like "SEO", "keyword density", "SERPs", etc. hold no interest for you. By even considering what effect your actions have on your SERPs you are violating Google's guidelines.

Dictionary.com's primary definition for "unethical" is based on "approved standards". Who sets these standards? Google? Google is a company. Google's primary focus is on generating profits, as any company their size has to be. Do you have any doubt that they are willing to take advantage of nearly any legal opportunity to gain an advantage over Yahoo or any of their other competitors?

Why do people in this industry choose to set their "moral compasses" by a company's guidelines? Google sets these guidelines because it makes it easier for them to produce search results superior to their competitors'; they don't want to have to expend resources on excluding sites that game their algo. I like Google, they have a fantastic product and they are one of those companies that I find easy to admire, but I certainly don't believe they created those guidelines in an attempt to create this utopian internet where everything is "fair"; they didn't and never claimed to.

Most of the world's economies are based on capitalism and free trade. Google has a cash cow, and her name is AdSense/AdWords. Webmasters compete against each other with their cold hard cash to see who can be #1. Is it unethical to squash your competition by outbidding them? Even if the loser was an unemployed guy with two ex-wives, three kids, a mortgage, $5 to his name, and a mammoth drug habit? Nearly everyone in this industry would say, "No, it's business."

If you are an SEO professional optimizing clients' sites, is it unethical to use "black hat" techniques on them? I certainly don't think so, as long as you make very clear to the client beforehand what risks are involved. Now, if you don't tell them they could get banned if caught, I would certainly call that unethical. As Ammon Johns, known as Black_Knight at the Cre8asite forums (www.cre8asiteforums.com), put it in a post: "To my philosophy, the SEO who refuses to properly inform their clients of all available techniques and the costs, risks and benefits is not ethical. Whether that is failure to tell them of risks, or failure to tell them of strengths. It is the client's role to make the choice, and a truly ethical SEO will enable the client to make a fully informed choice, not just one that meets what the search engines say is okay for all."

So are cloaking, creating doorway pages, search engine spamming, and other similar "black hat" techniques illegal? I know of no law that forbids this stuff, as long as you aren't violating someone's copyright by duplicating their text, stealing their bandwidth, etc. Those practices I would be the first to label "unethical", frustrating, illegal, deplorable; name your word.

The remaining unanswered question is: who decides what is "unethical", who sets "the standard"? You and I as internet professionals do. Some people choose to draw the "unethical" line just beyond what they themselves do to improve their own SERPs; anything past that is "unethical". On this subject, I draw the line pretty close to what the American government calls "illegal". I would call almost anything done with the primary intent to damage another website or company "unethical", and most anything done under those circumstances is also illegal.

Having said all that, to my knowledge I'm not doing anything on any of my (or my clients') sites that any major SE would consider banning me for, though I'm certainly aggressive with the "white hat" stuff. I have dabbled with some of the riskier techniques in the past, but I have put way too much work into my sites (the ones worth trying to achieve high rankings for, anyway) to risk getting them banned; the risk isn't worth it to me. So while I don't participate in any "black hat" (or what I prefer to call "risky") activities, I just don't like the idea of labeling those who do "unethical".

This is certainly an issue where people who have an opinion are passionate about it, but I invite you to examine your own motives for calling aggressive search engine optimization techniques "unethical".

About The Author

Damian owns a small website development business dedicated to providing affordable websites to small businesses. You can visit his website at www.divergentlines.com or contact him at [email protected]

66  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / The Sandbox Effect on: August 29, 2007, 02:53:03 PM
The Sandbox Effect


What many people once thought was a penalty is now being called the Sandbox Effect: new web sites do not rank well in Google's search results, not even for the least competitive phrases. In effect, a filter is placed on new web sites so that they cannot rank very high for most words or phrases for a certain amount of time.

This does not mean it's punishment for anything the webmaster did with the site, such as using Spam. The probation likely doesn't apply to the web site itself, but to its backlinks: after a link has stayed on the linking site for a certain amount of time, it comes off probation and is fully counted as a backlink to your web site. Even if your site is better than your competitors', it will still sit in the Sandbox.

There is a lot of debate as to why Google uses the sandbox filter. It is believed to be an attempt by Google to discourage web site owners who use SEO Spam techniques to rank high fast and make a quick buck before Google discovers the Spam. This is largely because Google relies so heavily on link popularity to rank web sites in its search engine.

There is no way to know for sure if a new web site is in the Sandbox, but the best indicator is a web site that has a good number of quality backlinks, quality content and a good ranking in other search engines, such as Yahoo!, MSN and Ask Jeeves, yet is still nowhere in Google's search results for its keywords.

It has been suggested that it can take up to 9 months before a new web site gets out of the Sandbox and achieves a high ranking in Google's search results, but this varies from web site to web site, and also depends on how competitive your new web site's market and keywords are.

There is no known way of speeding up the amount of time a new web site spends in the Sandbox. You could, however, get a few backlinks to your site before you fully launch it; you will still go through the Sandbox this way, but you won't waste as much time. You could also buy a domain that is already in use, so you never have to enter the Sandbox at all.

Matt Colyer began as an SEO Specialist in 1997. He founded Superior Webmaster in 2004 as a source of articles and tutorials for Web site owners looking to improve their Web sites.
67  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / Submit All Of Your Pages And Watch Your Traffic Grow on: August 29, 2007, 02:52:27 PM
Submit All Of Your Pages And Watch Your Traffic Grow


Everyone is looking for "secrets" about how to get more qualified traffic to their web sites. What I'm going to share with you is no secret, yet it is not practiced by very many companies or individuals. Many companies and individuals submit only their home page to search engines and directories. You can easily quadruple your traffic in 90-120 days by implementing the following procedures.

1. Create a unique title for each page of your site that covers a different subject. I am astounded when I surf the web and find large and small companies whose web sites don't have a decent title for their home page, much less titles for any of their content pages. Your title should not be "home". If you own a web design program, or have competent web designers, the title of every page can be changed in 10-30 minutes. Your title should be short and describe the content of that page. In case you were not aware, the title of each page appears in the upper left-hand corner of the browser of the person visiting your web site.

2. Create meta tags for each page that has a different title. Most people use the same meta tags on every page of their web site. If you have different content on different pages, then you should have different meta tags for those pages too! Meta tags are hidden text that programmers use to help the search engine "spiders" find your site on the world wide web. If you don't know what meta tags look like or how to write them, you can learn more at this URL on our web site: http://www.emarketingman.com/searchengine.htm (a small generator sketch follows this list).

3. Submit your pages to search engines and directories every month. Once you have proper titles and meta tags for the interior pages of your web site, register them once a month. If you want ANY of your pages indexed by search engines, you have to submit them once a month! You can do this manually or through various submission programs. The most affordable method we have found is the free directory submission tool and free search engine submission tool at JimTools: http://www.jimtools.com

JimTools allows you to register with 56 search engines and over 100 directories at no charge. If you do this once a month, you will start getting traffic to your newly submitted pages within 60 - 120 days.
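
Here is the generator sketch promised above: it emits a distinct title, description and keyword list for each page from a single table. All page names and text are invented examples.

# Emit per-page <title> and meta tags from one table of page data.
PAGES = {
    "index.html":    ("Acme Widgets - Handmade Widgets",
                      "Handmade widgets, shipped worldwide.",
                      "widgets, handmade widgets, acme"),
    "shipping.html": ("Acme Widgets - Shipping Rates",
                      "Shipping rates and delivery times for Acme widgets.",
                      "widget shipping, delivery, rates"),
}

for page, (title, desc, kw) in PAGES.items():
    print("<!-- %s -->" % page)
    print("<title>%s</title>" % title)
    print('<meta name="description" content="%s">' % desc)
    print('<meta name="keywords" content="%s">' % kw)
    print()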

In summary: optimizing the title and meta tags for each page can easily quadruple your traffic in just a few months. Submitting the internal pages of your web site to directories and search engines is an affordable and easy way to bring more qualified traffic to your web site.

About The Author

Daryl Clark is the owner of www.EMarketingman.com. A web site designed to teach individuals and organizations how to successfully promote their businesses on line. Visit his web site at http://www.emarketingman.com
68  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / Feed Those Hungry WIIFM Monsters, Get Them To Multiply on: August 29, 2007, 02:51:47 PM
Feed Those Hungry WIIFM Monsters, Get Them To Multiply


Everyone packs their website with keywords in order to feed those keyword-hungry search engines spiders.

Just do not starve your potential customers by forgetting to have the food they like:

    * ample helpings of benefits

    * tips rich in protein (practical)

    * automatic weekly feedings (like a newsletter)

If you (affectionately) look upon your potential customers as if they are WIIFM monsters, and understand how to care for and feed them, there will be no scary nightmares (unprofitable websites).

A WIIFM monster is someone who is interested in only one thing: what is in it for them.

It would take days to describe everything you need to know about these monsters. In the end you will see that those monsters are actually cute, cuddly, likeable monsters.

Just by observing these monsters (from a safe distance) you can learn so much ...

The moment one of these monsters arrives on your website, you need to start feeding it - immediately. It is best to have many automatic feeders at your website; those monsters are hungry - they must have food, NOW.

An automatic feeder used often is the autoresponder feeder. The monsters can pick which auto-feeder they want to feed from, and start receiving food in minutes. These automatic feeders feed the monster for days, one easy-to-digest helping per day. The monster has the option to stop the feeder at any time.

I MUST warn you about something here. If you do not have a facility for the monster to stop the feeding, the monster will explode. This is NEVER a pretty sight. Humans get their websites taken away, their email privileges revoked and so on.

Another food source is eBooks. The monsters can chew on these for days before they come back for more. The tastier (more valuable and practical) your eBooks, the better the chance that they will come back to YOU for more. Always have a footpath (link) in the eBook so the monster can find you again in the big jungle where everyone shouts at them: feed here for free, FREE: feed here ...

There are many different types of monsters that every website must have food for, but for now I will just tell you about the WIIFM monster.

This is the biggest monster of all. If it is not fed well, feeding all the other little monsters (like the freebie monsters) is of no use.

The what-is-in-it-for-me monster is also sometimes 'affectionately' called the WIIFM monster. It only wants to know what it can get from a website.

If it is clear to this monster that you have no tasty food, it leaves immediately.

This monster has a pair of very well developed eyes. It can see within 3 seconds if a website has its favourite food (WIIFM content).

These eyes are so well developed that it totally ignores banners. These eyes only go for the real tasty food (quality, practical content).

The monster also has a very, very fast click-claw. If those well-developed eyes do not see food in 3 seconds, that FAST click-claw reacts with a speed that makes the speed of light seem like a snail's crawl ... off goes the monster to find other, more worthwhile feeding grounds.

The more of its favourite food you give the WIIFM monster, the happier it gets. This causes the monster to quickly tell all its monster buddies about this amazing source of WIIFM food. Yours.

These might be monsters, but they are not too stupid. They know that no matter how much of this WIIFM food they eat, their buddies can eat this food too, without the food ever being eaten up.

The food is not actually eaten, it is food for thought!

So, the more you feed these monsters, the friendlier they become. So friendly that they start sending you email, and soon after that they start sending you money.

(Things sure have progressed tremendously since those dark ages - monsters can now send email too. That is how that spam-monster was born, but believe me, you do not want to hear THAT story ...)

To attract more of these monsters, you build a monster-habitat for them.

What they really like in such a habitat is things like ...

    * a WIIFM discussion

    * weekly fresh WIIFM food at the website

    * a weekly newsletter, also with fresh content

    * links to other worthwhile food sources

No matter how many other WIIFM websites you send them to, they will always come back to you - if you really care about them and want to feed them.

You become a sort of a leader of the pack for them. You lead them to worthwhile food sources.

These monsters are very willing to buy food from you. If this food is tasty, they will tell their monster buddies, and those buddies will buy this tasty food too. You will be amazed at how quickly the good news can spread. For some reason, these monsters are also efficient at telling everyone very quickly if you tricked them, so be fair in all your dealings with them.

There are so many stories I want to tell you about these monsters, their different types of food sources, their enemy monsters, how to become their friends, and so on, but my time is up (for now). Time waits for no monster, not even the WIIFM monster ...

To sum up the story so far, let us listen to one of the monsters I like best:

Sumarus - he likes to eat lots of tasty food (value content) and then summarize it, oh so brilliantly ...

Sumarus (after clearing his throat):

So, to summarize (he always starts just like that)

The sole purpose of a website is to be the feeding ground of WIIFM monsters. The more WIIFM food you dish up in the form of articles, ebooks, newsletters and links to other worthwhile food sources, the more WIIFM monsters you can attract.

Your primary role is to provide tasty food to attract as many WIIFM monsters as you possibly can. The tastier your food, the more you can ask them to pay for your commercial food. If WIIFM monsters see you have no food, they leave - immediately, sometimes sooner!

WIIFM monsters HATE vomited food. They LOVE fresh food you prepared just for them.

If WIIFM monsters and website owners get to know each other very well, this can lead to lifelong, beneficial partnerships for everyone involved.

Sumarus might be too theoretical and philosophical for most humans, so this is what I think he means:

The better you can cater for the daily, practical needs of your website visitors, the better the chance that they will stay with you and continue being a newsletter subscriber.

You should only recommend other worthwhile websites.

Even if there are lots of sources of free worthwhile content, you must add value to what you give your website visitors and newsletter readers.

Initially you must provide value content up front, but you will benefit from this later in the form of sales.

Sumarus also suggests that you reread this article to see how many other not-so-obvious tips you can find hidden in it.

About The Author

Article by Alwyn Botha of http://www.leveragedsuccess.com. Leveraged Internet Success website contains: Discussions, eBooks, articles, a weekly newsletter, and email courses. Leveraged = your maximum, exponential Internet success
69  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / Yahoo Listing Still Worth It? on: August 29, 2007, 02:51:06 PM
Yahoo Listing Still Worth It?


In October 2002, the Yahoo! portal changed the way it delivers search results. In the past, the most prominent results were exclusively culled from websites listed in the Yahoo directory itself. Since October, sites listed in the Yahoo directory no longer enjoy this privileged status.

The Google search engine now drives the primary search results on Yahoo. While this is certainly an improvement for users of Yahoo search, it's a disaster for many businesses that counted on their Yahoo listing to deliver substantial traffic.

This change has also led many site owners to question the value of a listing in the Yahoo directory. In this article, I will outline the pros and cons of maintaining, or paying for, a Yahoo listing. In the process, I will delve into more details of the recent changes.

Argument #1: Yahoo Listings Mean Link Popularity

Pro: Even if the Yahoo listing itself delivers little or no traffic, other search engines will rank your website higher if it's listed in Yahoo. Because Yahoo is so important, a link from Yahoo counts more than a regular link. Thanks to its higher "PageRank," Yahoo means even more to Google.

Con: Yahoo listings do not deliver nearly as significant a contribution in this area as you might think. You can verify this by doing a "backward links" search on Google for any Yahoo-listed website. The most important links are listed first, and the Yahoo listing is rarely even on the first page of links for top ranked sites on Google.

Argument #2: Listed Sites Look Better In The Search Results

Pro: Websites with a Yahoo listing show up in the combined Yahoo/Google results with their title, description, and category from the Yahoo directory. This may boost the response when the site appears in the search results. This applies when the URL listed in the results is the same as the URL in the Yahoo listing.

Con: Results listed with Yahoo information include a link to the site's category, which may prompt surfers to pass over your listing and go to the category. Sites without Yahoo listings have the more inviting "search within this site" link, which leads to more results exclusively from your site.

So, Is A Yahoo Listing Worth It?

If you have a non-commercial site and can get listed for free, of course! If you're not one of the lucky few, though, you have to evaluate whether it's worth $299 a year for what amounts to a better than average incoming link. Everyone must make their own decision. If $299 is small compared to your total marketing budget, it may be easier to just continue paying. My own listing expires in March, and I don't intend to renew it.

How Can You Profit From The Changes At Yahoo?

The obvious answer is that you must take steps to improve your own position in Google's search results. Google's rankings are made up of many factors, but the dominant factor is "PageRank," which is based on the number and quality of incoming links from other websites.

Therefore, the first step in improving your position on the Google search engine (and now Yahoo) is to improve your site's link popularity. This takes time, and trying to take shortcuts can get you into real trouble - Google doesn't like "link farms," or any program designed to artificially boost your link popularity.

Finding Quality Link Partners Through Google

Since only links from quality sites will count for much with Google, let's take a quick look at how you can find these sites. Start by targeting the sites that link to existing top-ranked sites. You can do a backward links search for any site by typing "link:http://www.domain.com" in the Google search engine.

An even faster method is to use the Google toolbar (http://toolbar.google.com/), which requires Internet Explorer 5 or greater, running on Windows. With the toolbar's advanced features enabled, you can conduct a "backward links" search from the "Page Info" menu for any site you visit.

Since Google lists these results in descending order by "PageRank," you can quickly determine the best places to get links by doing backward links searches on the top 10-20 sites for your desired search terms, and seeking links from the top 10-20 places that link to them.

Links Are Not Enough: Optimizing For Google

While "PageRank" is the dominant factor in Google's algorithm, it's not the only factor, and you still need to optimize your web pages. This can be a complicated topic, but the most important factors are:

    * Keywords in the title of the page

    * Keywords in headings on the page (H1 or H2 tags), especially the first heading.

    * Keywords in the body text of the page, particularly the first paragraph.
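
Here is the self-check sketch referenced above. It is deliberately crude: regular expressions instead of a real HTML parser, and it only inspects the first match for each tag.

# Check whether a keyword appears in the title, first heading and first paragraph.
import re

def on_page_check(html, keyword):
    kw = keyword.lower()
    def grab(tag):
        m = re.search(r"<%s[^>]*>(.*?)</%s>" % (tag, tag), html, re.I | re.S)
        return m.group(1).lower() if m else ""
    return {
        "title":           kw in grab("title"),
        "first heading":   kw in (grab("h1") or grab("h2")),
        "first paragraph": kw in grab("p"),
    }

page = "<title>Widget Guide</title><h1>Choosing a widget</h1><p>Widgets vary.</p>"
print(on_page_check(page, "widget"))  # all three True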

Don't Complain, Act!

By some estimates, Google now controls two-thirds of the searches conducted on the Internet in a given day. Not only is Google.com extremely popular in its own right, but Google also controls the search results on popular portals like AOL and Iwon.com - not to mention Yahoo.

A lot of website owners are complaining bitterly about this change. All the more reason for you to take action now, while so many of your competitors are busy licking their wounds. With a little planning and effort, you could be in a dominant position on Google before they even get started.

I wish you success...

About The Author

Dan Thies is the author of "Search Engine Optimization Fast Start," the ultimate beginner's guide to higher search engine rankings - available today at http://www.cannedbooks.com
70  THE TECHNO CLUB [ TECHNOWORLDINC.COM ] / SEO / Search Engine Musical Chairs on: August 29, 2007, 02:50:23 PM
Search Engine Musical Chairs


News broke this week that Yahoo has purchased the Inktomi search engine for around US$235 million. This is an interesting development in the search engine industry that may greatly impact exactly where sites get their traffic from. Another new development in the past few weeks is the change to the HotBot service, but first I'd like to look at the ramifications of the Yahoo/Inktomi deal.

Here's a bit of background on how the Yahoo and Inktomi search engines work. I'm sure everyone knows Yahoo, but not everyone may be completely aware of how their search engine works. Yahoo is a directory: a categorised list of sites maintained by human editors. Getting listed in the Yahoo directory requires a yearly payment for commercial sites. You can get listed for free if you have a non-commercial site, but it is very hard to do and can take months. When you use Yahoo's search function, it provides additional results that come from Google. In the past Yahoo has used other search engines to provide these results (including Inktomi), but it has used Google for the last couple of years.

Yahoo initially used results from other search engines just to complement its own, so that someone searching for something that wasn't in the directory could still get results. The way it worked was that Yahoo would provide results from its own directory first, and a user would click on "web pages" to get more results. Strangely, Yahoo changed this during the year so that search results became a mix of listings from its own directory and Google, with the Google results being the more prominent of the two. This has essentially made the payment of US$299 per year for a directory listing an unnecessary expense. Many people still use the directory to browse for sites but, in my experience, more people use the search function.

Inktomi is a full crawler-based search engine that provides results for other search engines. It does have a web site, but no-one actually uses it to search; it's more of an informational site. Inktomi's results formerly appeared in many search engines, but in the last few years its popularity has declined. The main site that currently uses Inktomi is MSN, and even there Inktomi only provides the fifth level of results: MSN shows sponsored listings from Overture, its own Microsoft sites, its own human-edited directory results, LookSmart listings and then Inktomi. So Inktomi does drive traffic to sites, but far less than a search engine like Google.

So, what does all this mean to us? It's hard to say at the moment, but there are three options: nothing will change (which is unlikely), Yahoo will use Inktomi together with Google on its site, or it will dump Google and use Inktomi for its search results. Yahoo actually owns part of Google, maybe 5%, so there is a chance it will keep using Google's results for this reason. Also, Google provides far better results for searchers than Inktomi, so Yahoo would be making a mistake to replace Google. But apparently Yahoo is annoyed that Google has become competition through Google's own site, and that Google's news search service is also providing competition.

Without a crystal ball, it's very difficult to know what is going to happen, but it's worth making sure that you are prepared for all eventualities. There is really very little that you need to do, because it is only the popularity of Inktomi that may change. Even without the searches from Yahoo, Google will still be the biggest search engine in the world, given that its own site is very popular and its results are also used by AOL (amongst others). So, what can you do to get the most from Inktomi?

Inktomi is the only major search engine that uses meta tags. This is actually the reason it is not as popular as before: meta tags allow webmasters to "trick" Inktomi into thinking a page has more relevant content than it really does, so Inktomi's search results are not very good. However, as there is a good chance that Inktomi will now become popular again, make sure you have your meta keyword and meta description tags in place, and make sure they are relevant to the page they are on - you don't want to be caught "spamming" the engine. The other thing that will probably make a difference is getting other sites to link to you; link popularity makes a difference on all search engines except possibly AllTheWeb.

I'll keep you updated here with any news on what Yahoo decides to do, and any changes I find out about in the way Inktomi generates results.

As I mentioned above, the other major search engine news in the last month is that HotBot has finally updated its site. HotBot is owned by Lycos and was quite a popular search engine a few years ago. It has become far less popular lately; since it was purchased by Lycos it was just left to "die". Its results largely came from a mix of the ODP, DirectHit and Inktomi. DirectHit has since become Teoma, which is owned by AskJeeves, but HotBot went for months after DirectHit ceased to exist still saying that its results were coming from DirectHit.

So, HotBot has essentially become another meta search engine, like DogPile or Excite. It now works pretty much like Netscape does: a searcher can choose which search engine they would like their results to come from. The engines you can choose from are FAST (AllTheWeb), Google, Inktomi and Teoma. This may mean that HotBot starts to win back some of its market share, but as webmasters there is nothing we can do to target HotBot itself; all we can do is target the search engines that provide its results.

About The Author

Sean Burns is the author of the WebmastersReference.com Newsletter - http://www.webmastersreference.com/newsletter. More than five years of experience in site design, marketing, income generation, search engine optimisation and more is passed on to subscribers - hype free. Sign up today to get real information of real value to webmasters.

[email protected]