Sander k
New member
How do you configure your robots.txt file?
Creating and configuring a robots.txt file for your Glype proxy is an essential step in the setup process and certainly one you do not want to skip.
The robots.txt file's primary purpose is to exclude content from being crawled by robots, typically the ones search engines use to index content.
The robots.txt protocol is a convention that asks cooperating robots/spiders not to access all or part of a website that is otherwise publicly viewable. Robots are often used by search engines to categorize and archive websites, or by webmasters to proofread source code.
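To see how cooperating robots honor these rules, here is a minimal sketch using Python's standard urllib.robotparser; www.domain.com is a placeholder, and the paths match the example file below.

Code:
from urllib.robotparser import RobotFileParser

# Download and parse the site's robots.txt rules,
# just as a well-behaved crawler would before fetching pages.
rp = RobotFileParser()
rp.set_url("http://www.domain.com/robots.txt")
rp.read()

# A cooperating robot skips any URL the rules disallow.
print(rp.can_fetch("*", "http://www.domain.com/browse.php"))  # False
print(rp.can_fetch("*", "http://www.domain.com/index.php"))   # True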
To create a robots.txt file for your Glype proxy, simply open a text editor such as Notepad, copy the code snippet below, and save it as a plain text file. Upload the file to the root of your hosting account (or wherever your Glype proxy script is located); you should then be able to access it in your web browser at a URL of this form: http://www.domain.com/robots.txt
Code:
# Apply these rules to all robots
User-agent: *
# Keep crawlers out of the proxy entry script
Disallow: /browse.php
# ...and out of Glype's temporary files directory
Disallow: /tmp
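Once the file is uploaded, it is worth confirming that it is actually reachable at the expected URL. A quick check, again sketched with the Python standard library (replace the domain with your own):

Code:
import urllib.request

# Fetch the uploaded robots.txt and confirm the server returns it.
url = "http://www.domain.com/robots.txt"
with urllib.request.urlopen(url) as resp:
    print(resp.status)           # expect 200 (OK)
    print(resp.read().decode())  # should print the rules above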