Wednesday, February 13, 2008

Why Spiders Have Difficulty Crawling a Web Page

Certain types of navigation may block or entirely prevent search engine spiders from reaching your website's content. As search engine spiders crawl the web, they rely on the architecture of hyperlinks to find new documents and to revisit those that may have changed. In the analogy of speed bumps and walls, complex links and deep site structures with little unique content serve as "bumps," while data that cannot be reached through spiderable links qualifies as "walls."

Possible "Speed Bumps" for SE Spiders:

URLs with 2+ dynamic parameters, e.g. http://www.url.com/page.php?id=4&CK=34rr&User=%Tom% (spiders may be reluctant to crawl complex URLs like this because they often produce errors for non-human visitors)
Pages with more than 100 unique links to other pages on the site (spiders may not follow each one)
Pages buried more than 3 clicks/links from the home page of a website (unless there are many other external links pointing to the site, spiders will often ignore deep pages)
Pages requiring a "Session ID" or Cookie to enable navigation (spiders may not be able to retain these elements as a browser user can)
Pages split into "frames" (framed pages can hinder crawling and confuse the engine about which page to rank in the results)
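The first "speed bump" above, URLs with two or more dynamic parameters, is easy to check for programmatically. Here is a minimal sketch (the function names and the threshold are my own illustration, not part of any spider's actual logic):

```python
from urllib.parse import urlparse, parse_qs

def count_dynamic_params(url):
    """Count the query-string parameters in a URL, a rough 'speed bump' signal."""
    query = urlparse(url).query
    return len(parse_qs(query, keep_blank_values=True))

def looks_crawler_unfriendly(url, max_params=1):
    """Flag URLs with 2+ dynamic parameters, which spiders may hesitate to crawl."""
    return count_dynamic_params(url) > max_params
```

Running this against the example URL from the list above flags it, while a simple single-parameter URL passes.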

How a Web Page Appears in Search Engines

When you search for information on the internet, you typically type a query or keyword phrase into a search engine, and the search engine returns results for that query or keyword phrase.

So, how does a web page appear in search engines?
Here is the process:

1. Crawling a Web Page
Search engines run automated programs, called "bots" or "spiders", that use the hyperlink structure of the web to "crawl" the pages and documents that make up the World Wide Web. Estimates are that of the approximately 20 billion existing pages, search engines have crawled between 8 and 10 billion.
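The crawling idea, following the hyperlink structure outward from known pages, can be sketched as a breadth-first traversal. Real spiders fetch pages over HTTP; in this sketch the "web" is just an in-memory dictionary mapping each page to the pages it links to, so the example stays self-contained:

```python
from collections import deque

def crawl(link_graph, start):
    """Breadth-first crawl of a toy link graph: {url: [linked urls]}."""
    seen = {start}
    queue = deque([start])
    order = []
    while queue:
        url = queue.popleft()
        order.append(url)                      # a real spider would fetch and index here
        for link in link_graph.get(url, []):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return order

# Toy web: only pages reachable through links get crawled.
toy_web = {
    "/": ["/about", "/blog"],
    "/blog": ["/blog/post-1"],
    "/orphan": [],   # no inbound links -- a spider starting at "/" never finds it
}
```

Note how "/orphan" is never visited: this is exactly why pages behind crawl "walls" stay out of the index.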

2. Indexing Documents
When a page has been crawled, the contents will be "indexed" and stored in a giant database of documents that makes up a search engine's "index". This index needs to be tightly managed so that requests which must search and sort billions of documents can be completed in fractions of a second.
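The core data structure behind a search engine's index is an inverted index, which maps each term to the set of documents containing it, so a query can be answered without rescanning every page. A minimal sketch:

```python
def build_index(documents):
    """Build a tiny inverted index: term -> set of doc ids containing it."""
    index = {}
    for doc_id, text in documents.items():
        for term in text.lower().split():
            index.setdefault(term, set()).add(doc_id)
    return index

docs = {
    1: "fish and fishing magazine",
    2: "fishing boats for sale",
    3: "cooking fish at home",
}
index = build_index(docs)
```

Looking up a term is now a single dictionary access, which is what makes sub-second responses over billions of documents feasible (real engines add compression, sharding, and positional data on top of this idea).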

3. Processing Queries
When a request for information comes into the search engine (hundreds of millions arrive each day), the engine retrieves from its index all the documents that match the query. A match is determined when the terms or phrase are found on the page in the manner specified by the user.

For example, a search for fish and fishing magazine at Google returns 8.25 million results, but a search for the same phrase in quotes ("fish and fishing magazine") returns only 166 thousand results. In the first search, commonly called "Findall" mode, Google returned all documents containing the terms "fish", "fishing", and "magazine" (it ignores the term "and" because it is not useful in narrowing the results), while in the second search only pages with the exact phrase "fish and fishing magazine" were returned. Other advanced operators (Google lists 11 of them) can change which results a search engine considers a match for a given query.
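The difference between "Findall" mode and an exact-phrase search can be shown in a few lines. This is a toy sketch over a handful of documents (the document set and stopword list are illustrative, not Google's):

```python
def find_all(documents, query, stopwords=("and",)):
    """'Findall' mode: ids of docs containing every non-stopword term, in any order."""
    terms = [t for t in query.lower().split() if t not in stopwords]
    return {doc_id for doc_id, text in documents.items()
            if all(t in text.lower().split() for t in terms)}

def phrase_search(documents, phrase):
    """Exact-phrase mode: ids of docs containing the phrase verbatim."""
    return {doc_id for doc_id, text in documents.items()
            if phrase.lower() in text.lower()}

docs = {
    1: "fish and fishing magazine reviews",
    2: "magazine about fishing for fish",
    3: "fishing magazine",
}
```

Document 2 matches in Findall mode (it has all three terms) but not in phrase mode, which mirrors why the quoted Google search above returns far fewer results.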

4. Ranking Results
Once the search engine has determined which results match the query, the engine's algorithm (a mathematical equation commonly used for sorting) runs calculations on each result to determine which is most relevant to the given query. The engine sorts the results on the results pages from most relevant to least relevant so that users can choose which to select.
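The "score each match, then sort" step looks like this in miniature. The scoring function here is a deliberately crude term-frequency count; real engines combine hundreds of signals (PageRank, anchor text, and so on):

```python
def score(text, terms):
    """Toy relevance score: how many times the query terms occur in the text."""
    words = text.lower().split()
    return sum(words.count(t) for t in terms)

def rank(documents, query):
    """Return matching doc ids sorted from most to least relevant."""
    terms = query.lower().split()
    scored = [(score(text, terms), doc_id) for doc_id, text in documents.items()]
    scored = [(s, d) for s, d in scored if s > 0]          # drop non-matches
    scored.sort(key=lambda pair: (-pair[0], pair[1]))      # best score first
    return [doc_id for _, doc_id in scored]

docs = {
    1: "fishing tips and fishing gear",
    2: "gardening tips",
    3: "fishing magazine",
}
```

Document 1 mentions "fishing" twice, so it outranks document 3, and document 2 is excluded entirely because it never matches.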

Although search engine operations are not particularly lengthy, systems like Google, Yahoo!, AskJeeves, and MSN rank among the most complex, processing-intensive computer systems in the world, managing millions of calculations each second and serving information to an enormous number of users.

Monday, December 24, 2007

After Your Website Is Done

After your website is done, you need a search engine optimization firm.

Tuesday, December 18, 2007

SEO Techniques Change

SEO techniques change every year, so we should keep up with the changes :)

Wednesday, October 24, 2007

About Off Page and On Page Optimization

On-page search engine optimization covers the techniques applied directly to the web page you are trying to rank highly in the search engines for your chosen keyword. Off-page search engine optimization, on the other hand, covers techniques applied "off the page", that is, on other people's websites. It essentially involves managing the structure of the inbound links pointing to your website. The bottom line is that off-page optimization means structuring your inbound links in such a way that the search engines recognize your website as having more importance than other websites.

Importance of On-Page Optimization.

So how important is on-page search engine optimization? It is all-important and not very important at the same time. On-page optimization alerts the search engines to the topic of your web page, which is of course absolutely necessary before they can assign any ranking to your site or page for your topic or keyword. However, on-page optimization by itself does very little, if anything, to increase your actual rankings. In short, it rarely raises your position on its own, but without it the search engines could hardly rank your page for any keyword at all.

On-Page Optimization Techniques.

So what are the on-page search engine optimization techniques? In short, they revolve around strategically placing your keyword or keyword phrase in specific locations on your web page and within your page code.
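A simple way to see this idea concretely is to check which common on-page locations contain your keyword. The locations below (title tag, top heading, body text) are an illustrative list, not an official specification, and the function is my own sketch:

```python
import re

def keyword_placements(html, keyword):
    """Report which on-page locations contain the keyword (toy HTML check)."""
    kw = keyword.lower()
    lower = html.lower()
    def inside(tag):
        # Find the tag's content and test for the keyword inside it.
        match = re.search(r"<{0}[^>]*>(.*?)</{0}>".format(tag), lower, re.S)
        return bool(match) and kw in match.group(1)
    return {"title": inside("title"), "h1": inside("h1"), "body": inside("body")}

page = """<html><head><title>Fishing Magazine</title></head>
<body><h1>Fishing Magazine Online</h1><p>All about fishing.</p></body></html>"""
```

For real pages an HTML parser would be more robust than a regular expression, but the sketch shows the principle: the same keyword phrase appearing in several meaningful locations signals the page's topic.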


Sean Mize is a full time internet marketer who has written over 900 articles in print and 9 published ebooks.

Saturday, October 20, 2007

Open Directory Project (ODP) Listing

The Open Directory Project (ODP), at http://dmoz.org, is the most important taxonomic directory on the Web. The ODP is run along the lines of an open source project, with the belief that "humans do it better."

The ODP believes that automated web search is ineffective, and that the small contingent of paid editors at commercial search engine companies cannot keep up with the staggering rate of change on the Web: decaying and stagnant sites, link rot, new sites, and sites intended as search spam.

Editors of dmoz.org follow internal checks and balances to keep the integrity of the directory. See http://dmoz.org/guidelines/ for more information about the ODP review process and guidelines for site inclusion.

The ODP is run and maintained by a vast army of volunteer editors. You, too, can become an ODP editor in an area of your interest and expertise. See http://dmoz.org/help/become.html for more information about becoming an ODP editor. One of the most effective ways to use SEO to promote your sites is to follow the patterns and practices of the ODP to get your sites included.

Google and most of the major search engines use information derived from the ODP, but each major search engine uses it in its own way. With Google in particular, information from the ODP is used to form the Google Directory, http://directory.google.com.

If you want the best way to get indexed by search engines, including Google, and, to a significant extent, to manage how your site is categorized, you should submit your site to the ODP. It's worth it.

Taxonomies of Directory

Taxonomies are categorized directories of information. A directory uses a structured way to categorize sites, which is how a directory differs from the index used by a search engine. An often-used analogy is that a directory is like the table of contents in a book: the directory tells you how the book is organized, while the index allows you to search within the book for specific keywords.

You can do this by working with the two most important structured directories: the Open Directory Project (ODP) and the Yahoo! Directory.
It is a not-so-well-kept secret that one of the best approaches for getting good placement in the search engine listings is to enter through a back door, by being listed in the two directories mentioned above.

Wednesday, September 5, 2007

Submission Tools

Maybe you also want to use an automated site submission service that submits your site to multiple search engines in one click.

Before using a site submission tool, you should prepare a short list of keywords and a one or two sentence summary of your site. Alternatively, you can use the keywords and description used as meta information for your site.

If you Google a phrase like "Search Engine Submit" you'll find many free services that submit your site to a group of search sites. Typically, these free submission sites try to up-sell or cross-sell you on a product or service, but since you don't have to buy anything, why not take advantage of the free service? Some of the best examples of this kind of site are:

1. NetMechanic, http://www.netmechanic.com
2. Submit Express, http://www.submitexpress.com
3. ExactSeek, http://new.exactseek.com
4. Free Web Submission, http://www.freewebsubmission.com

Getting Listed in Search Engines

If your page or site has inbound links from sites already in a search index, then Google, or any other search engine, will most likely find you quickly. However, it is peculiar but true that different search engines index different portions of the Web.

To avoid being left out, it makes sense to manually add your URLs to the search engines. In the early days there could be a long delay before your site was found, so it really made sense to make sure you were listed.

To add your URL manually:
1. Yahoo! go to http://submit.search.yahoo.com/free/request
2. Google, go to http://www.google.com/addurl/

Getting listed is, of course, only the beginning. One of the goals of core SEO is to get highly ranked and listed for specific search terms. Achieving this goal requires taking affirmative control of the information that the search index may use to make ranking decisions and that it will use to judge the quality of your pages.

4 Basic Functions of Google as a Search Engine

Search engines such as Yahoo! or Google are highly complex implementations of software technology that have evolved into businesses, and they certainly provide access to much of the information you can find on the Internet.

A search engine such as Google implements four basic functions:

1. Discovery, meaning finding web sites. This is accomplished using software that travels down web links, sometimes called a bot, robot, or webbot.

2. Ranking, used to order stored pages by how important they are. Google uses a complex mechanism, such as PageRank, to accomplish this task.

3. Storage of links, related information, and page summaries. Google calls the system used for this purpose its index.

4. Return of results, used to organize the display of search results, based on ranking.

Discovery, Ranking, Storage, and Return are all important to SEO.
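The ranking function mentioned above can be illustrated with a toy version of the PageRank idea: a page is important if important pages link to it. This sketch uses simple power iteration over an in-memory link graph; it ignores refinements like dangling-node handling, so treat it as an illustration of the principle rather than Google's actual algorithm:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank via power iteration. `links` maps each page to its outlinks."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}            # start with uniform importance
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue                          # simplified: skip dangling pages
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:               # each outlink passes on a share
                new_rank[target] += share
        rank = new_rank
    return rank

web = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
ranks = pagerank(web)
```

Page "c" ends up with the highest score because it is linked from both "a" and "b", while "b" receives only half of "a"'s vote, which is the intuition behind links acting as endorsements.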