Mark David
Lucky Thirteen: An SEO Spot Check
« Posted: September 15, 2007, 12:45:24 AM »



We can't all be SEO experts, but here is a list of 13 ways, gathered from the experts, to give your SEO a boost.

1) Check search engine crawl error reports - It's important to check the search engines' crawl error reports to stay on top of your site's performance and that of its pages. The error reports can tell you when and where Googlebot or another crawler is having trouble indexing your content - which can help you find a solution to the problem.

2) Create/update robots.txt and sitemap files - These files are supported by all the major search engines and are very useful tools for ensuring that crawlers index your site's content while avoiding the sections or files that you feel are unimportant or that cause problems in the crawl process. In some cases the proper use of these files makes the difference between total crawl failure and a full index of your content pages, which makes them crucial from an SEO standpoint.
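As a minimal sketch (the paths and the sitemap URL are placeholders, not recommendations for any particular site), a robots.txt file steers crawlers away from problem areas and points them at your sitemap:

    User-agent: *
    # keep crawlers out of script directories and scratch space
    Disallow: /cgi-bin/
    Disallow: /tmp/
    # tell crawlers where the sitemap lives
    Sitemap: http://www.example.com/sitemap.xml

And a bare-bones sitemap.xml lists the pages you do want indexed:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
      </url>
      <url>
        <loc>http://www.example.com/articles/</loc>
      </url>
    </urlset>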

3) Check Googlebot activity reports - These reports tell you how long it's taking Googlebot to access your pages. If it is taking search engine crawlers a long time to fetch your pages, they may sometimes "time out" and stop trying. And if the crawlers are unable to call your pages up quickly, there is a good chance that users are experiencing the same lag in load times.

4) Check how your site looks to browsers without image and JavaScript support - One of the best ways to determine what your site looks like to a search engine crawler is to view your pages in a browser with image and JavaScript support disabled. Mozilla's Firefox browser has a plug-in called the "Web Developer Toolbar" that adds this functionality, and a lot more, to the popular standards-compliant browser. If, after turning off image and JavaScript support, you aren't able to make sense of your pages at all, that is a good sign that your site is not well optimized for search.

5) Make sure that all navigation is in HTML, not images - A common mistake in web design is to use images for site navigation. For some companies SEO is not a concern, but for anyone worried about having well-optimized pages this should be the first thing to go. Image-based navigation is essentially valueless to search engine crawlers, and very similar visual effects can usually be achieved with CSS rollovers, which maintain the aesthetic impact while still providing valuable, relevant link text to search engines.
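As a sketch of the alternative (the IDs, URLs and colors are invented for illustration), a plain HTML list with a CSS :hover rule gives you the rollover effect while keeping real, crawlable link text:

    <style type="text/css">
      /* the :hover rule takes the place of an image swap */
      #nav a { color: #333; background: #eee; text-decoration: none; }
      #nav a:hover { color: #fff; background: #336; }
    </style>
    <ul id="nav">
      <li><a href="/products/">Products</a></li>
      <li><a href="/support/">Support</a></li>
      <li><a href="/contact/">Contact Us</a></li>
    </ul>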

6) Check that all images include ALT text - This is another easy place to optimize your pages. Search engines can't see the images themselves; they can only read your ALT text, and only if you've provided it.
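For example (the file name and wording are invented for illustration):

    <img src="acme-blue-widget.jpg" alt="Front view of the Acme blue widget">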

7) Use Flash content sparingly - From an SEO standpoint, Flash files might as well be spacer GIFs - they're empty. Search engines are not able to index text or other content inside a Flash file. For this reason, while Flash can do a lot for presentation, from an accessibility and SEO standpoint it should be used very sparingly and only for non-crucial content.

8) Make sure each page has a unique <title> and meta description tag - Optimizing <title> tags is one of the most important on-page SEO points. Many webmasters are apparently unaware of this, and either use duplicate <title> tags across multiple pages or do not target search traffic at all within this valuable tag. Run a search on a competitive keyword of your choice on Google, click on the first few links that show up, and note what text appears in the title bar of the browser window. You should see right away that this is a key place to include target keywords for your pages.
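As a sketch of a well-targeted head section (the product and company names are invented):

    <head>
      <title>Blue Widgets - Prices and Specifications | Acme Widget Co.</title>
      <meta name="description" content="Compare prices and specifications for Acme's full line of blue widgets.">
    </head>

Each page should get its own unique title and description rather than sharing one boilerplate pair site-wide.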

9) Make sure that important page elements are HTML - Crawlers essentially see only your source code. Anything you've put together in a Flash movie, an image or any other multimedia component is likely to be invisible to search engines. With that in mind, it should be clear that the most important elements of your page - the heart of your content - should be presented in clean, standards-compliant, optimized HTML source code.

10) Be sure to target keywords in your page content - Some webmasters publish their pages in hopes that they will rank well for competitive keywords within their topic or niche. This will simply never happen unless you include your target keywords in the page content. This means creating well-optimized content that mentions these keywords frequently without triggering spam filters.

11) Don't use frames - There is still some debate as to whether frames are absolutely horrible for SEO or merely not the best choice. Either way, you probably don't want to use frames. For one thing, crawlers can have trouble getting through to your content and effectively indexing individual pages. For another, most of the functionality that frames provide is easily duplicated with proper CSS.
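As one rough sketch (the selector names and sizes are placeholders), the classic "navigation frame" layout can be reproduced with CSS positioning, keeping all the content in a single crawlable document:

    /* an always-visible navigation column, without a frameset */
    #nav     { position: fixed; top: 0; left: 0; width: 180px; }
    #content { margin-left: 200px; }

(Note that position: fixed is not honored by some older browsers, notably IE6, so test accordingly.)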

12) Make sure that your server is returning a 404 error code for unfound pages - While broken links pointing to missing pages should definitely be avoided, you also don't want a "custom error page" that is served with a normal 200 status code in place of a real 404. Why? If a broken URL returns an ordinary page, crawlers can spend time following broken links with no way of knowing they are broken. A 404 status code is easily recognizable, and search engine crawlers are programmed to stop following links that generate it.
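On Apache, for example (assuming .htaccess overrides are enabled; the page path is a placeholder), you can still show visitors a friendly page while the server returns the proper 404 status:

    # .htaccess
    ErrorDocument 404 /errors/not-found.html

The target must be a local path: if you hand ErrorDocument a full http:// URL, Apache redirects to it, and the response carries a redirect status instead of a 404.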

13) Make sure crawlers will not fall into infinite loops - Many webmasters use scripting languages such as Perl, PHP and ASP to add interactive functionality to their web pages. What some don't realize is that, unless they use robots.txt files or take other preventative measures, search engine crawlers can fall into what are called "infinite loops" in their pages - endless chains of dynamically generated URLs. Crawlers are built to recognize when they've run into an infinite loop, and they will simply stop indexing pages at a site that is flagged for this error.
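One common preventative measure is a robots.txt rule that keeps crawlers out of the scripts where loops arise (the paths below are hypothetical):

    User-agent: *
    # a calendar script can generate an endless chain of "next month" links
    Disallow: /calendar.php
    # search-results pages can link back into themselves indefinitely
    Disallow: /cgi-bin/search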

Arthur Browning
# Web Templates Blog
# Design Interviews
# Template Monster Blog

Arthur Browning began his career teaching technical writing at a small Midwestern university for 15 years. He later edited and published a national professional journal for some ten years. He is now an investor. His interests include art collecting, web marketing, and writing.
