Re: Why did this happen?


Posted by floorcookie on May 16, 2001 at 13:03:14:

In Reply to: Re: Why did this happen? posted by Chris on May 15, 2001 at 19:50:34:

Chris, there actually is a way to stop or at least slow down some of the more ethical 'bots. I work as a webmaster, so I know a little bit about it. Create a file called robots.txt and put it in the root directory of your site (http://diver.net/).

The file should be formatted as follows:


# /robots.txt file for http://diver.net/
# mail webmaster@diver.net

User-agent: *
Disallow: /bbs
Disallow: /any-other-folder-you-want


Most bots will look for the robots.txt file before scanning the site, and they will not look into any folders listed under "Disallow". Of course, there is nothing to stop someone from creating an "unethical" bot that ignores robots.txt. However, I have found that most of the "hackers" out there are actually idiots who couldn't program a thing to save their lives; they are using commercial bots that they bought, and those are mostly of the ethical variety and will check for the file.
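Just to illustrate what a well-behaved bot does on its end, here is a minimal sketch using Python's standard urllib.robotparser module (the URLs are only the diver.net examples from above, not anything taken from Chris's setup):

# Sketch of an "ethical" crawler checking robots.txt before fetching a page.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://diver.net/robots.txt")
rp.read()  # fetch and parse the rules

# /bbs is disallowed for all user-agents in the file above,
# so can_fetch() should return False for it and True for the rest of the site.
print(rp.can_fetch("*", "http://diver.net/bbs/index.html"))  # False
print(rp.can_fetch("*", "http://diver.net/index.html"))      # True

A bot that skips this check will crawl the disallowed folders anyway, which is why robots.txt only helps against crawlers that choose to respect it.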

-- floorcookie

