Prevent offline use of site

Linux

Guest
Some people like to download your site so they have it available all the time (for instance with 'HTTrack Website Copier'). The problem with this is that they are eating my bandwidth, and I can get charged for too much data traffic. Is there a way of preventing this kind of action?<br />
I saw a message about a file called robots.txt but can't remember the details. Is this related to it?<br />
<br />
Thanks,<br />
<br />
Hanno<!--content-->The robots.txt file relates to search-engine bots; it contains info about how they should index your site.<br />
<br />
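For illustration, a minimal robots.txt (the correct filename ends in 's') that asks HTTrack to stay away while leaving other crawlers alone might look like the sketch below — the domain is a placeholder, and the file is purely advisory, so only well-behaved clients honor it:

```text
# robots.txt — served from the site root, e.g. http://example.com/robots.txt

# Ask HTTrack's crawler to skip the whole site
User-agent: HTTrack
Disallow: /

# All other crawlers may index everything
User-agent: *
Disallow:
```

HTTrack honors robots.txt in its default configuration, but a user can switch that off, so this is a polite request rather than a block.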
Also, you can't really block people from downloading your site.<!--content-->OK, after browsing through HTTrack I noticed this:<br />
<br />
<!-- m --><a class="postlink" href="http://www.httrack.com/HelpHtml/abuse.html#WEBMASTERS">http://www.httrack.com/HelpHtml/abuse.html#WEBMASTERS</a><!-- m --><!--content-->Mark,<br />
<br />
I'll go through the options mentioned in your link and pick out an easy, non-aggressive one.<br />
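One of the simpler server-side options is to refuse requests whose User-Agent header identifies HTTrack. A sketch, assuming an Apache server with mod_rewrite enabled (adjust for your host):

```text
# .htaccess in the site root — returns 403 Forbidden to HTTrack's default User-Agent
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} httrack [NC]
RewriteRule .* - [F,L]
```

This only catches clients that announce themselves; a downloader can spoof its User-Agent, so treat it as a deterrent rather than a guarantee.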
<br />
I am getting more curious about the robots.txt file. I think I will post about that on the 'search engine' forum.<br />
<br />
Thanks,<br />
<br />
Hanno<!--content-->Here are a few links about robots.txt files:<br />
<br />
<!-- m --><a class="postlink" href="http://www.webtoolcentral.com/webmaster/tools/robots_txt_file_generator/">http://www.webtoolcentral.com/webmaster ... generator/</a><!-- m --><br />
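As a quick illustration of how a well-behaved crawler reads these files, Python's standard `urllib.robotparser` module can be pointed at a set of robots.txt rules. The rules below are a made-up example, not taken from any real site:

```python
from urllib import robotparser

# Hypothetical robots.txt: ban HTTrack entirely, keep /private/ out of indexes
rules = """\
User-agent: HTTrack
Disallow: /

User-agent: *
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# A compliant HTTrack run would skip the whole site...
print(parser.can_fetch("HTTrack", "/index.html"))    # False
# ...while other crawlers may fetch public pages but not /private/
print(parser.can_fetch("Googlebot", "/index.html"))  # True
print(parser.can_fetch("Googlebot", "/private/x"))   # False
```

Again, this only models what compliant clients do; nothing in robots.txt enforces the rules on the server side.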
<br />
I did have a really good site that explained robots.txt files, but for the life of me I can't find it. If I do, I will add it to my site's tutorials.<!--content-->