Promotion
- Rate your site for appropriateness to various age levels and sensibilities to ensure that access to it will not be blocked in error.
- Validate the code on your site to ensure that the search engines will parse it properly and that your users will not experience errors.
- Make your site as friendly to the search engines as possible.
- Arrange for a domain name if one is needed.

Preparation
Armed with a list of keywords, you next want to optimize your ranking for searches on them. Although the various search engines differ in their approach, the following rules will give you good results. They are listed in order of importance, with the most critical first (an example illustrating several of them follows the list):
- Use the main keyword in the page name.
- Use the keywords in your TITLE tag.
- Use the keywords in your KEYWORDS meta tag.
- Use the keywords in your DESCRIPTION meta tag.
- Keep meta tags under 255 characters; some search engines may truncate or ignore longer ones.
- Use the keywords in H tags (H1, H2, etc.).
- Use the keywords in the body of your pages, especially near the top of the page.
- Use the keywords in the ALT attribute of your IMG tags.
- Either avoid using frames, or make extensive use of the NOFRAMES tag.
- Get other sites to link to yours.
- Try to optimize the ratio of keywords in your body text to the rest of the words.
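For example, a page saved as web-authoring.htm and built around the keyword phrase "web authoring" might apply several of these rules at once. The file name, tag values, and text here are all made up for illustration:
<title>Web Authoring Tips and Tools</title>
<meta name="keywords" content="web authoring, HTML, web design">
<meta name="description" content="Free web authoring tips, tools, and resources.">
...
<h1>Web Authoring Tips</h1>
<p>Web authoring does not have to be difficult or expensive...</p>
<img src="logo.gif" alt="web authoring tools">
Here the keyword phrase appears in the page name, the TITLE tag, both meta tags, the H1 tag, the body text near the top of the page, and the ALT attribute.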

sitemap.xml
Web crawlers usually discover pages from links within the site and from other sites. However, you may not have links between every page in your site in a form that the crawlers can find. This is especially likely if you use Java, JavaScript, or server-side navigation controls or menus. Having a sitemap.xml file does not guarantee that your pages will be included in search engine indexes, but it does make sure the crawlers know about every page you list. It also enables you to submit the sitemap.xml file to the major search engines instead of submitting each page of your site individually. The file can be created in any plain text editor (like Notepad). It needs to have these lines at the top:
<?xml version='1.0' encoding='UTF-8'?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
Alternatively, if you want the file to reference the official schema so that validating parsers can check it, use this longer form of the header instead:
<?xml version='1.0' encoding='UTF-8'?>
<urlset xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9
http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd"
xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
To simply list the URL of each page, put each one on a separate line using the following format:
<url><loc>http://webauthoring.tercenim.com/index.htm</loc></url>
The file needs to end with the line:
</urlset>
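Putting the pieces together, a complete minimal sitemap.xml (the second URL is invented for illustration) would look like this:
<?xml version='1.0' encoding='UTF-8'?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url><loc>http://webauthoring.tercenim.com/index.htm</loc></url>
<url><loc>http://webauthoring.tercenim.com/promotion.htm</loc></url>
</urlset>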
If you want to include the optional metadata, please refer to the official protocol site for the details.
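For reference, the protocol defines three optional elements for each URL: lastmod (the date the page last changed), changefreq (how often it is likely to change), and priority (its importance relative to your other pages, from 0.0 to 1.0). An entry using all three, with made-up values, looks like this:
<url>
<loc>http://webauthoring.tercenim.com/index.htm</loc>
<lastmod>2006-01-15</lastmod>
<changefreq>monthly</changefreq>
<priority>0.8</priority>
</url>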

robots.txt
A robots.txt file in your site's root directory serves several purposes:
- It tells the spiders which, if any, files and directories you do not want indexed. For example, you probably don't want your script and CSS files indexed; indexing them puts an unnecessary load on your server and has no benefit. You may not want your images indexed if they are copyrighted and you don't want them turning up in image search results.
- It tells the search engines where to find your XML site map file.
- It prevents your server from logging a 404 (file not found) error every time a spider looks for the file and fails to find it.
The first line of the file should point to your XML sitemap file. You must use the full URL of the file. For example:
Sitemap: http://webauthoring.tercenim.com/sitemap.xml
The next line specifies a particular spider by name. If, as is most often the case, you want to provide the same instructions to all spiders, use an asterisk (*) as a wildcard. For example:
User-agent: *
Disallow: /scripts/
Note that you must use a trailing slash on the directory name. Some spiders recognize wildcards on Disallow lines, but many do not, so I recommend naming each file you want to exclude individually rather than doing something like this:
Disallow: *.css
Disallow: *.js
You can have as many Disallow lines as you like for a spider, and as many separate spider sections as you like, but you can list only one file or directory on each Disallow line. Leave a blank line between the last Disallow line of one section and the User-agent line that starts the next. Complete information about the robots.txt file can be found at The Web Robots Pages. If you want all your files and directories to be accessed by spiders, use a single Disallow line with nothing listed, like this:
User-agent: *
Disallow:
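Putting all of this together, a complete robots.txt for the example site might look like the following. The directory names, and the section for Google's image spider, are hypothetical; adjust them to your own site.
Sitemap: http://webauthoring.tercenim.com/sitemap.xml

User-agent: *
Disallow: /scripts/
Disallow: /styles/

User-agent: Googlebot-Image
Disallow: /images/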

If you would rather enter your choices into a form and have it spit out a complete and valid robots.txt file, this is just what you are looking for. And of course, it's free. They also have plenty of other useful tools and articles that you may find interesting.

Resources
Before you can optimize your site for the search engines, you need to determine which keywords are most useful for your site. This tool will show you approximately how many daily searches your keywords would get. It also shows results for related keywords, so you can find ones that will produce better results than the ones you entered.

Search Engine Marketing 101: This site covers search engine submission and registration issues. It explains how search engines find and rank web pages, with an emphasis on what webmasters can do to improve their search engine placement by using better page design, HTML meta tags, and other tips.
This page outlines the entire process of optimizing your site for the search engines and includes specific information about the major ones. It is part of a larger site covering broader search engine marketing topics: everything from search engine marketing strategy to search engine placement tactics, awards, email, press releases, links, and suggestions for optimal search engine submission and search engine optimization. They sell SEO services, but provide huge amounts of information for free.

You'll find links to thousands of on-line articles about effective Web marketing and to on-line resources for business.
This free service makes it easy for visitors to add your site to their favorite social or private bookmarking site. It is what powers the Bookmark button in the upper-right corner of this page.

META Tags
These control the description of your page that many of the search engines will use, and provide information about who wrote the page, the language of the text, and many other attributes.
They also provide some instructions to the browser as to how to best interpret your page.
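For example, in addition to the KEYWORDS and DESCRIPTION tags shown earlier, a page might carry tags like these (the values are placeholders):
<meta name="author" content="Your Name">
<meta http-equiv="content-language" content="en">
<meta http-equiv="content-type" content="text/html; charset=ISO-8859-1">
<meta name="robots" content="index,follow">
The first two describe the page's author and language, the content-type tag tells the browser how to interpret the page, and the robots tag tells spiders whether to index the page and follow its links.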

There are many tags that can go in the "HEAD" section of a web page. This site tells you all about them and provides a form that will generate most of them for you so you can just copy and paste them into your page.
This is another META tag generator. It generates some tags that the Dr. Clue generator does not, but misses some that it does, so I recommend you use both.

Domain Name
Many search engines give preferential ranking to URLs that have a keyword-relevant domain name, like "www.FreeOrCheap.com" or "www.WebAuthoring.com".


The Dynamic DNS service allows you to alias a dynamic IP address to a static hostname, so your computer can be more easily accessed from various locations on the Internet. This is useful if you want to host a website on your own PC, but your ISP changes your IP address every time you connect. The WebHop Redirection service provides web redirection to complement their dynamic and static DNS services. It allows you to alias your long, hard-to-remember, ugly URLs to a short hostname in your choice of a small set of their offered subdomains, so you can promote "www.yourname.dyndns.org" instead of your site's true and far longer URL. The services are provided for free, financed mostly by donations from grateful users.