Robots.txt for Glype

Sander k

New member
How do you configure your robots.txt file?

Creating and configuring a robots.txt file for your Glype proxy is an essential step in setting it up, and certainly one you do not want to skip.

The primary purpose of robots.txt is to exclude content from being crawled by robots, typically those used by search engines to index content.

The robots.txt protocol is a convention that tells cooperating robots/spiders not to access all or part of a website that is otherwise publicly viewable. Robots are often used by search engines to categorize and archive websites, or by webmasters to proofread source code.

To create a robots.txt file for your Glype proxy, simply open a text editor such as Notepad, copy the code snippet below and save it as a plain text file. Upload the file to the root of your hosting account (or wherever your Glype proxy script is located). You should then be able to access the file in your web browser at a URL of this form: http://www.domain.com/robots.txt

Code:
# Applies to all crawlers
User-agent: *
# Keep proxied pages (served through browse.php) out of search indexes
Disallow: /browse.php
# Keep the temporary/cache directory out of search indexes
Disallow: /tmp
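
If you want to sanity-check the rules before (or after) uploading them, one option is to run them through a robots.txt parser. Below is a minimal sketch using Python's standard-library urllib.robotparser; the www.domain.com URLs are placeholders for your own domain, and the expected True/False results are noted in the comments.

Code:
# Minimal sketch: verify the robots.txt directives above with Python's
# standard-library parser. www.domain.com is a placeholder domain.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /browse.php
Disallow: /tmp
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())  # parse the rules locally, no fetching needed

# Proxied pages and the temporary directory should be off-limits to all
# crawlers, while the proxy's landing page stays crawlable.
print(parser.can_fetch("*", "http://www.domain.com/browse.php?u=example"))  # expected: False
print(parser.can_fetch("*", "http://www.domain.com/tmp/somefile"))          # expected: False
print(parser.can_fetch("*", "http://www.domain.com/"))                      # expected: True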
 

Genesis

Administrator
Staff member
@Sander If you use material from another website, you need to give credit by using the quotation tool and citing the source. It looks as though the content of your post originated from an Instant Proxy article.
 

Sander k

New member
It's an article I bought in 2009, one of thousands of SEO-related articles. I guess any of the 200 or so people who bought them could have published it.
 

Genesis

Administrator
Staff member
True. But even if you bought the article, the copyright still remains with the author.
 

PeaceSigns

New member
Be very careful with this: if you block Google, it will take ages (seriously, weeks to months) for the issue to be corrected.
 

Genesis

Administrator
Staff member
Wonder whether that could potentially happen one day? A PRO hacker wanting to do major damage to Google? Guess Google has bought out all the PRO hackers there are?
 

PeaceSigns

New member
They have multiple data centers that are not synchronized (I know this because you can search for results across their individual data centers and see how different the results are across the boxes), and I'm sure they have a team watching the data 24/7.
 

Genesis

Administrator
Staff member
PeaceSigns said:
They have multiple data centers that are not synchronized (I know this because you can search for results across their individual data centers and see how different the results are across the boxes), and I'm sure they have a team watching the data 24/7.
I'm sure they have every security measure in place, and then some more. :cool:
 

PeaceSigns

New member
I think it was Google that was buying a data center in Alaska to save on costs (both space, so they can expand with more server racks, and cooling costs).
 

Genesis

Administrator
Staff member
Aha .... Alaska is not far from where I am right now. Wonder whether that would make a difference when one is closer to their data centers?
 

PeaceSigns

New member
Perhaps a speed difference, but I think Google detects your IP and takes you to a country-specific version of the search. So if you're in Canada right now, you're going to get Canadian results - but if you cross the border north into Alaska, you're gonna get US search results.
 

Genesis

Administrator
Staff member
Well, that one is not rocket science, Peace. You can't imagine how IRRITATING it is to keep getting the Arabic default in the UAE when I open Google, and having to click through to Google.com every time - and I'm sure even then my search results may be different. One of the things on my to-do list here in Vancouver is to get a decent proxy, so it will default to the US every time. I can't get a proxy in the UAE as all proxy websites are blocked.

I meant more along the lines of how fast their search engine would be. I.e., would it be faster if one were closer to Alaska?
 

PeaceSigns

New member
Just visit google.com/ncr - the "ncr" (no country redirect) part sets a cookie so you always stay on google.com and don't get redirected to the UAE version.

Speaking of which - is the UAE version really that different, censorship-wise?
 

Genesis

Administrator
Staff member
Google is not censoring anything, but they probably use filters that tailor search results to the UAE and the Middle East. On the Internet service provider side, though, there is enormous censorship - plenty of blocked sites, including proxy websites. On the one hand, one gets a dynamic line (with shorter lease intervals than the international standard) that one shares with hundreds of others, including spammers; a static line is much too expensive and is usually limited to business accounts. And to top it all off, there is no legal option to use a proxy service. A great number of people do use a proxy service of course, as that makes things more secure when navigating the Internet. I may do it myself, and I should get my act together and find a service now while I'm in Canada.