
Use of Site Map

Usually saved as sitemap.html, sitemap.php, etc., this file contains a list of all the important links of your website, and the URLs in it should be SEO friendly. The site map helps search engines traverse your links, which finally results in your pages being indexed.
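As a sketch, a minimal sitemap.html can be as simple as a plain list of links; the page names and URLs below are made-up examples in the SEO friendly style discussed later in this series:

```html
<!-- sitemap.html: a plain list of the site's most important links -->
<html>
<head><title>Site Map</title></head>
<body>
<h1>Site Map</h1>
<ul>
  <li><a href="http://culblog.com/topics/technical/2/">Technical</a></li>
  <li><a href="http://culblog.com/topics/seo-techniques/4/">SEO Techniques</a></li>
  <li><a href="http://culblog.com/contact/">Contact</a></li>
</ul>
</body>
</html>
```

Keeping every link as descriptive text (rather than images or scripts) is what makes the page useful for link traversal by crawlers.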

Link Exchange or Web Ring

A webring in general is a collection of websites from around the Internet joined together in a circular structure. When used to improve search engine rankings, webrings can be considered a search engine optimization technique. To be a part of the webring, each site has a common navigation bar; it contains links to the previous and next site. By clicking next (or previous) repeatedly, the surfer will eventually reach the site they started at; this is the origin of the term webring. However, the click-through route around the ring is usually supplemented by a central site with links to all member sites; this prevents the ring from breaking completely if a member site goes offline.

A link exchange (also known as a banner exchange) is a confederation of websites that operates similarly to a web ring. Webmasters register their web sites with a central organization that runs the exchange, and in turn receive from the exchange HTML code which they insert into their web pages. In contrast to a web ring, where the HTML code comprises simple circular navigation hyperlinks, in a link exchange the HTML code displays banner advertisements for the sites of other members of the exchange on the member web sites, and webmasters have to create such banner advertisements for their own web sites.

Link exchanges have advantages and disadvantages from the point of view of those using the World Wide Web for marketing. On the one hand, they have the advantages of bringing in a highly targeted readership (for link exchanges where all members of the exchange have similar web sites), of increasing the "link popularity" of a site with Web search engines, and of being relatively stable methods of hyperlinking. On the other hand, they have the disadvantage of potentially distracting visitors away to other sites before they have fully explored the site that the original link was on.

PAID INCLUSION PROGRAMS

Paid Submission

Sites pay a certain amount to have a directory editor look at their pages and evaluate them within a certain time period. Since the backlog at some directories is 4-5 months or more, this can be a good deal. However, the directory doesn't guarantee that your site will get listed - only evaluated.

Paid Placement

Used first by GoTo (now called Overture). Sites can purchase either a top rank or prominent listing for particular search terms. The listing may or may not be identified as a paid advertisement. Some search engines and directories charge a flat fee while GoTo/Overture uses a pay per click system.

Paid Inclusion

Search engines guarantee to list pages from your Web site in their database and re-spider them on a regular basis, usually at least once per week. Unlike paid placement, you aren't guaranteed a particular place in the search rankings.


Handling Dynamic URLs

One of the major reasons for using a server-side language such as PHP is for the ability to generate dynamic content. Often this will lead to single scripts that produce their content based on the input parameters (that is, the variables in the URL).

SEO friendly URLs

SEO friendly URLs are highly encouraged for dynamic site navigation.
Some factors which help in pretty good indexing of pages are:

1. Number of pages
2. Access frequency of pages
3. Meaningful (SEO friendly) URLs for accessing a page
4. Link exchange

So let us think about the appearance of a non SEO friendly URL; it may look like
http://culblog.com/index.php?sec=2&id=5
Based on the supplied URL parameters 'sec' and 'id', the respective item is fetched and shown in the browser. Here the index page is manipulated to display the different items. Search engines are less efficient at indexing dynamic URLs. In order to overcome this, we can make some alterations to this URL so that it is more meaningful to search engines as well as to users of the website.
A suggestion for the above URL would be something like the one below:
http://culblog.com/index/technical/2/sectechniques/5/
By this technique you:

1. Get a static-looking dynamic URL which is more meaningful to the user and to the search engine.
2. Hide the technology or language you have used for creating the website.
3. Make the pages easily indexable by search engines.
4. Give web crawlers the impression that the site has lots of pages under lots of virtual folders.
5. Get easily trackable URLs (you can easily pick them from the list of URLs that drops down from the address bar of the browser).

How to generate SEO friendly URLs: this is not a very complex step. It is like generating the dynamic URLs, but with some more meaningful fields added to the URL.

A sample script for creating the dynamic URLs:

<?php
// Fetch the list of sections and print a dynamic link for each one.
// (The query, the table and the SectionID column are illustrative;
// only $row['SectionName'] appears in the original snippet.)
$result = mysql_query("SELECT SectionID, SectionName FROM sections");
while ($row = mysql_fetch_array($result)) {
    print("<a href=\"index.php?sec=" . $row['SectionID'] . "\">" . $row['SectionName'] . "</a>");
}
?>

This will generate links like:
Personal
Technical
Spiritual
SEO Techniques
Some additional work can be done on this to make it an SEO friendly URL.
We can change the above PHP code like this:

<?php
// Build SEO friendly links of the form topics/<SectionName>/<SectionID>/
// (as above, the query and the SectionID column are illustrative).
$result = mysql_query("SELECT SectionID, SectionName FROM sections");
while ($row = mysql_fetch_array($result)) {
    print("<a href=\"topics/" . $row['SectionName'] . "/" . $row['SectionID'] . "/\">" . $row['SectionName'] . "</a>");
}
?>

This will generate links like:
Personal
Technical
Spiritual
SEO Techniques

If you check the fourth URL you can find a space between 'SEO' and 'Techniques';
in such cases you can replace the space with a hyphen (-):
print("<a href=\"topics/" . str_replace(" ", "-", $row['SectionName']) . "/" . $row['SectionID'] . "/\">" . $row['SectionName'] . "</a>");
and the output link will look like
SEO-Techniques
Avoid using '_'; use '-' instead.
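To automate this replacement, a small helper can build the slug from a section name. This is a sketch of my own; the function name and the exact cleanup rules are not from the article:

```php
<?php
// Hypothetical helper: turn a section name into an SEO friendly slug.
// Lowercases the name, converts runs of spaces and punctuation to a
// single hyphen, and trims stray hyphens from the ends.
function make_slug($name) {
    $slug = strtolower(trim($name));
    $slug = preg_replace('/[^a-z0-9]+/', '-', $slug);
    return trim($slug, '-');
}

print(make_slug("SEO Techniques")); // prints "seo-techniques"
?>
```

Running every section name through such a helper keeps all generated URLs consistent, instead of fixing spaces case by case.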
The next question is how this URL is accessed, or how the server identifies these SEO friendly URLs.
This magic is done by the .htaccess file, which is placed in the root of your web directory.

.HTACCESS file and URL rewrite rule


.HTACCESS FILE: .htaccess is the default name for a file that is used to indicate who can or cannot access the contents of a specific file directory from the Internet or an intranet. The .htaccess file is a configuration file that resides in a directory and indicates which users or groups of users are allowed access to the files contained in that directory. It may also contain rewrite rules which tell the server to serve a page in that directory if the URL in the browser address bar satisfies any of the rewrite rules in the .htaccess file.

A sample .htaccess file, in which all requests to whatever.htm will be sent to whatever.php:

DirectoryIndex index.php

# Deny everyone access to the .htaccess file itself
# (the <Files> container is the usual idiom for this)
<Files .htaccess>
order allow,deny
deny from all
</Files>

Options +FollowSymlinks
RewriteEngine on
RewriteRule ^(.*)\.htm$ $1.php [NC]

Let us rewrite our example URLs; the .htaccess file content will be like this:
Options +FollowSymlinks
RewriteEngine on
RewriteRule ^(.*)\.htm$ $1.php [NC]
RewriteRule ^topics/(.+)/([0-9]+)/(.+)/([0-9]+)/?$ http://%{SERVER_NAME}/index.php?sec=$2&id=$4 [NC]

Here the SEO URL
http://www.culblog.com/topics/seo-techniques/4/generate-seo-friendly-url/23/
will be rewritten to
http://www.culblog.com/index.php?sec=4&id=23
and, in the same way, a URL such as http://culblog.com/topics/technical/2/sectechniques/5/ will be rewritten to http://culblog.com/index.php?sec=2&id=5.
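On the PHP side nothing changes: after the rewrite, index.php still receives the parameters through $_GET. A minimal sketch, using the parameter names 'sec' and 'id' from the article (the helper function itself is mine):

```php
<?php
// Read the 'sec' and 'id' parameters the rewrite rule passes through,
// casting to int as a basic sanity measure, with 0 as the default.
function read_params($get) {
    $sec = isset($get['sec']) ? (int)$get['sec'] : 0;
    $id  = isset($get['id'])  ? (int)$get['id']  : 0;
    return array($sec, $id);
}

list($sec, $id) = read_params($_GET);
// ... fetch and display the item for this $sec/$id ...
?>
```

Casting the values to integers also gives a small safety benefit, since the raw strings from the URL never reach the database query directly.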


Search engine optimisation tips

SEO Techniques

What you need to do is try to get your web site listed as high as possible in the results, and there are a number of things you can do to help. As 95% of people using search engines only look at the top 20 search results, it is important to get your web site as high as possible.

SEO factors to be considered:

1. Page Title
2. Meta tags
3. Page Content
4. Hyper Links
5. Images
6. Links to the Site - URL Exchange
7. Site Using Frames
Page Title

Make sure each page on your web site has a title in the head of the document. Many search engines use the title of the web page as the link to your site, so it is important that your title be as relevant and descriptive as possible.
Example: <title>Home page of Jesus Youth India</title>

Meta tags

Meta tags are one of the most important parts of a web page when it is indexed by a search engine. The description meta tag is very often used by a search engine to display a short description of what your web page contains. Some search engines look at the keywords entered into the keywords meta tag and will use them to display your page in the search results if one of the words matches the search criteria. If you want your web site indexed, then consider using meta tags. Meta tags go in the head of your web page, in between the HTML tags <head> and </head>. There are a number of different meta tags that you can use, but the most important ones are the Description and the Keywords meta tags, as well as having a title for the web page.

Description

This tag is used to give a short description of the contents of your web page, and is often used by search engines in the search results as a description of what your page contains.
Example: <meta name="description" content="official website, JY - is Spiritual youth movement dedicated to Christ">

Keywords

To help get your web site up in the ratings you can supplement the title and description with a list of keywords, separated by commas. Most search engines will index the first 64 characters in this meta tag.
Example: <meta name="keywords" content="Jesus,youth,JY,Emmavoos,Jesus Youth,Spiritual,Movement">

Revisit-After

The revisit-after meta tag is useful for sites where the content changes often, and tells the search engine how often to revisit your site. The example below will tell the search engine to revisit your site every 31 days.
Example: <meta name="revisit-after" content="31 days">

Distribution

Tells the search engine who the page is meant for and can be set to: global, for everyone; local, for regional sites; and iu, for internal use.
Example: <meta name="distribution" content="global">

Robots

This meta tag is used to tell the search engine whether you want the web page indexed or not. The values for this tag are:

index (default) - Index the page
noindex - Don't index the page
nofollow - Don't index any pages hyper-linked to this page
none - Same as "noindex, nofollow"

Example: <meta name="robots" content="noindex,nofollow">

Meta Tag Example:

<head>
<title>Jesus Youth India Home page</title>
<meta name="description" content="official website, JY - is Spiritual youth movement dedicated to Christ">
<meta name="keywords" content="Jesus,youth,JY,Emmavoos,Jesus Youth,Spiritual,Movement">
</head>

Page Content

A number of search engines will index the first few lines of your web page, so try to make the first few lines as descriptive as possible. Some search engines will even index your entire page.

Hyper Links

Try to place as many descriptive text links in your homepage to other relevant pages in your site as possible, as search engines will use these links to index the other pages on your site.
Usage: <a href="issue21.html">Issue21</a> (the link target here is illustrative; only the link text appears in the original)

Images

Most sites these days contain images, so it is important that you use the ALT attribute on any images to describe as much as possible what the image is of.
Usage: <img src="issue21cover.jpg" alt="my Magazine issue 21 Cover Image"> (the file name here is illustrative)

Links to the Site - URL Exchange

Many search engines, including Google, will place your web site higher in search results according to the number of web sites that link to your site; also, the higher the profile of a site that links to yours, the higher your listing in search results.

Site Using Frames

It is very difficult to get a good ranking on search engines if your web site uses frames, because search engines do a poor job of indexing framed web sites. If you must use frames on your web site for some reason, make sure you use the NoFrames tag so that search engines can find some text to index. The NoFrames tag is a tag specifically for search engines (and browsers) that cannot read the actual pages in your frame set.
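To illustrate the NoFrames advice, a framed page with a fallback might be sketched like this; the frame file names are made up:

```html
<html>
<head><title>My Framed Site</title></head>
<frameset cols="20%,80%">
  <frame src="menu.html">
  <frame src="content.html">
  <noframes>
    <body>
      <!-- This text is what search engines and frameless browsers will see -->
      Welcome to the site. <a href="content.html">Browse the content here</a>.
    </body>
  </noframes>
</frameset>
</html>
```

Putting real descriptive text and links inside the noframes body, rather than just "This site requires frames", gives crawlers something worth indexing.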


Search engine optimisation tips

SEO techniques - Tips

You can find millions of web sites on the Internet, and their number is growing fast. In such a scenario we need to think about some pretty good strategies that make your site visible to the Web world.

What is a Web Search Engine

A Web search engine is a search engine designed to search for information on the World Wide Web. The information may consist of web pages, images and other types of files. Commonly used search engines are Yahoo, Google, MSN, AltaVista, and so on.

How Web Search Engines Work

A search engine operates in the following order:

1. Web crawling

2. Indexing

3. Searching

Web search engines work by storing information about many web pages, which they retrieve from the WWW itself. These pages are retrieved by a Web crawler (sometimes also known as a spider), an automated Web browser which follows every link it sees. The contents of each page are then analyzed to determine how it should be indexed (for example, words are extracted from the titles, headings, or special fields called meta tags). Data about web pages are stored in an index database for use in later queries. Some search engines, such as Google, store all or part of the source page (referred to as a cache) as well as information about the web pages, whereas others, such as AltaVista, store every word of every page they find.

When a user enters a query into a search engine (typically by using key words), the engine examines its index and provides a listing of best-matching web pages according to its criteria, usually with a short summary containing the document's title and sometimes parts of the text.

Crawlers

A web crawler is a program which automatically traverses the web by downloading documents and following links from page to page. Crawlers are mainly used by web search engines to gather data for indexing. Web crawlers are also known as spiders, robots, bots, etc.

How Crawlers/Spiders work

Crawler-based search engines have three major elements. First is the spider, also called the crawler. The spider visits a web page, reads it, and then follows links to other pages within the site. This is what it means when someone refers to a site being "spidered" or "crawled." The spider returns to the site on a regular basis, such as every month or two, to look for changes.
Everything the spider finds goes into the second part of the search engine, the index. The index, sometimes called the catalog, is like a giant book containing a copy of every web page that the spider finds. If a web page changes, then this book is updated with new information.
Sometimes it can take a while for new pages or changes that the spider finds to be added to the index. Thus, a web page may have been "spidered" but not yet "indexed." Until it is indexed -- added to the index -- it is not available to those searching with the search engine.
Search engine software is the third part of a search engine. This is the program that sifts through the millions of pages recorded in the index to find matches to a search and rank them in order of what it believes is most relevant.
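The "follows links" step of the spider described above can be sketched in a few lines. This toy extractor is my own illustration; real crawlers parse HTML properly and respect robots.txt:

```php
<?php
// Toy illustration of the link-following step of a crawler:
// pull the href targets out of a page's HTML with a regular expression.
function extract_links($html) {
    preg_match_all('/<a\s+[^>]*href="([^"]+)"/i', $html, $matches);
    return $matches[1];
}

$page = '<a href="page1.html">One</a> and <a href="page2.html">Two</a>';
print_r(extract_links($page)); // page1.html and page2.html
?>
```

A crawler would queue each extracted link, download it, and repeat, which is how every reachable page of a site eventually enters the index.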

How to exclude site pages from Indexing

Exclusions can be made by the use of a robots.txt file. Based on the specifications in robots.txt, the specified files or directories will stay hidden from indexing.

A Sample robots.txt file

Here is what your robots.txt file could look like:

# Robots.txt file created by http://www.webtoolcentral.com
# For domain: http://192.168.0.213

# All robots will spider the domain
User-agent: *
Disallow:

# Disallow Crawler V 0.2.1 admin@crawler.de
User-agent: Crawler V 0.2.1 admin@crawler.de
Disallow: /

# Disallow Scooter/1.0
User-agent: Scooter/1.0
Disallow: /

# Disallow directory /cgi-bin/
User-agent: *
Disallow: /cgi-bin/

# Disallow directory /images/
User-agent: *
Disallow: /images/

Put this file in your root directory.