Parckwart’s Computer Stuff


Published: February 28, 2017

Tor2web is a Mess

Tor2web is a web proxy software and part of the GlobaLeaks project that enables access to Tor onion services for regular Internet users. The idea is simple: set up web proxies that take a requested .onion URL, fetch the corresponding webpage over the Tor network and deliver it back to the user, who is using their regular, non-Tor-enabled web browser. In my opinion, this is a great idea, as were so many by Aaron Swartz, who published the original software back in 2008 together with Virgil Griffith. It’s a simple way for onion service providers to benefit from some of the advantages that onion services provide while remaining reachable for inexperienced users. It’s an attempt to bridge the gap that comes from running separate networks on the internet. It even allows onion services to be indexed by regular search engines. But why is the technical implementation so bad?

How it works in general

Contributors run the previously mentioned web proxies. They register an intuitive domain (e.g. onion.tld) in the real DNS (except, which doesn’t follow the "onion scheme"). Additionally, they configure a wildcard subdomain, so requests to anything.onion.tld reach their proxies. A proxy takes a request URL that reaches it (e.g. https://parckwartvo7fskp.onion.tld/webpage.html), strips out the TLD and performs its own request to the actual onion service via plain HTTP, not HTTPS. Once the proxy gets a response from the onion service, it modifies the page a bit to adjust hyperlinks, image URLs etc. to the Tor2web URL and to show a warning to the user that they’re using Tor2web, and then sends this page to them. That’s about all that can be said in general about Tor2web, because everything else is handled differently by the individual proxies.
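As a rough illustration of the hostname rewriting described above, here is a minimal sketch (not the actual Tor2web code) of how a proxy might map an incoming Tor2web hostname back to the onion address it stands for. The domain onion.tld is a placeholder, as in the text:

```python
def onion_from_host(host, proxy_suffix="onion.tld"):
    """Return the .onion hostname hidden in a Tor2web request host,
    or None if the host does not belong to this proxy's domain."""
    suffix = "." + proxy_suffix
    if not host.endswith(suffix):
        return None
    # The innermost subdomain label is the onion address itself
    label = host[: -len(suffix)].rsplit(".", 1)[-1]
    return label + ".onion"

# Example, using this article's own onion service:
print(onion_from_host("parckwartvo7fskp.onion.tld"))  # parckwartvo7fskp.onion
```

The real proxies additionally have to rewrite links and embedded resource URLs in the returned HTML, which is where most of the behavioural differences between them come from.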

Comparing the proxies

Here’s the main problem: not all of these proxies are running the same software, or even different software that behaves in a roughly identical, agreed-upon manner. While is not even using the official Tor2web software, it is still mentioned alongside proxies that do on the Tor2web website. is not mentioned there, so it’s not "official", I guess.

I am going to compare the different Tor2web proxies in this article, but first I have to exclude two:

Detecting Tor2web

This issue affects me personally, because the very site this article is hosted on has an onion service (parckwartvo7fskp.onion) but also a regular domain on the internet (. So do many others, like DuckDuckGo, Facebook or the Debian Project. Especially for sites like Facebook, it is unacceptable to have some third-party web proxy between the user and the web server, as they deal with sensitive, personal data. It also ruins your search engine ranking, as your real website has to compete with many identical-looking ones under different domains. So there should be a way to detect whether a user is connecting via Tor2web in order to block or redirect such requests.

The Tor2web developers have thought about this and implemented an HTTP header named X-Tor2web, which the proxy sends to the onion service web server. Facebook as well as my site use this header to detect and block Tor2web. But not all Tor2web proxies actually send this header:

There are six more headers included in this table:

Using simple methods, I know of no way to detect . Chloe came up with some ideas for , exploiting bugs in the proxy or using JavaScript on the client side, but that’s a really awful solution.
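On the server side, checking for the X-Tor2web header is straightforward. Here is a minimal sketch (not production code, and not how Facebook or my site necessarily implement it) as a plain WSGI middleware that redirects detected Tor2web requests to the onion address:

```python
# Sketch: redirect any request carrying the X-Tor2web header to the
# onion service. The onion address is this site's own; everything else
# is illustrative.

ONION = "http://parckwartvo7fskp.onion"

def block_tor2web(app):
    def middleware(environ, start_response):
        # WSGI exposes "X-Tor2web: ..." as environ["HTTP_X_TOR2WEB"]
        if "HTTP_X_TOR2WEB" in environ:
            start_response(
                "302 Found",
                [("Location", ONION + environ.get("PATH_INFO", "/"))],
            )
            return [b"Redirecting to the onion service.\n"]
        return app(environ, start_response)
    return middleware
```

The same check works in any web framework or server config; the only requirement is that the proxy actually sends the header, which, as the table above shows, not all of them do.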

User agent header

 and change the User-Agent header to make the request look like it’s coming from Tor Browser. Currently Tor Browser uses the UA of Firefox running on Windows 7, no matter which platform it’s actually running on. Both proxies, however, use the UA headers of old versions of TBB:
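An onion service could at least flag requests whose User-Agent claims to be an outdated Tor Browser. A rough heuristic sketch follows; the version numbers are illustrative assumptions, not an authoritative list of TBB releases:

```python
import re

# Tor Browser reports Firefox on Windows 7 ("Windows NT 6.1")
# regardless of the real platform.
TB_UA = re.compile(r"Windows NT 6\.1.*Firefox/(\d+)\.0")

def looks_like_old_tbb(user_agent, current_esr=45):
    """True if the UA matches the Tor Browser pattern but claims a
    Firefox ESR version older than the (assumed) current one."""
    m = TB_UA.search(user_agent)
    return bool(m) and int(m.group(1)) < current_esr

print(looks_like_old_tbb(
    "Mozilla/5.0 (Windows NT 6.1; rv:31.0) Gecko/20100101 Firefox/31.0"))
# True
```

This is fragile, of course: the moment a proxy operator updates their faked UA string, the heuristic goes blind, which is exactly why the X-Tor2web header is the right mechanism.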

Response headers

All proxies change the response HTTP headers to some extent.
[Table: per-proxy comparison of whether the original response headers are kept (fully, partially or not at all) and whether Tor users are redirected; one proxy mostly keeps the original headers from the web server.]

When looking at the response headers from and, it becomes clear that these two are not running the original Tor2web software, but just some self-configured nginx. , and replace the original Server header with their own nginx version. ignores the original headers and completely replaces them with its own. This is also another security risk, since it strips out all security-related headers like Content-Security-Policy, X-Frame-Options or X-XSS-Protection. and insert a header named X-Check-Tor, which is set to false unless the user comes from a Tor exit node IP address, in which case they are redirected to the original onion service domain anyway. also sends the header, but for me it always contains true, even if I don’t connect over Tor. That proxy never redirects me, though. only seems to keep some headers, like the Server header; others are removed, as with

Cloudflare tracking

The home page of (by which I mean only, not is hosted on Cloudflare’s CDN and is down at the time of writing. If you visit it, you get Cloudflare’s __cfduid cookie containing a unique ID, which is then sent to every onion service you browse using, resulting in excellent trackability across all onion services.

Even more tracking

 shows an error page if a connection to the requested onion service cannot be made. This page shows advertisements for HideMyAss, which load JavaScript from their servers. The user also gets cookies from HMA, and they can determine which onion service the user wanted to visit. Cooperating with HideMyAss doesn’t really seem compatible with Tor’s philosophy to me.

HideMyAss advertisement on

 sets three cookies, two of which seem to be user-unique: _pk_id.4.c2e2 and onion_cab_iKnowShit. At least these are not sent to the onion service, as happens with’s Cloudflare cookie. Additionally, uses Piwik, a JavaScript analytics platform. also sets a cookie named _PSI containing a unique ID. This one, however, is sent to the onion service, which, again, results in excellent trackability across all onion services a user visits through the proxy.

The trophy for the worst privacy of all the proxies goes to, which injects a whole bunch of ads into every single web page. It even bombards the user with pop-up windows.

 injects ads into web pages

JavaScript, images etc. are being loaded from countless domains including such classics as or

uMatrix on

It should be mentioned that every single one of these sites can determine which onion service the user is browsing and is theoretically capable of serious attacks on sensitive data the web page might be handling.

TLS at the onion side

Since official TLS certificates for .onion domains are available, some sites enforce HTTPS. These only work with . doesn’t respond to such requests at all; the other proxies end up in an infinite redirection loop.


A good way to detect a real Tor2web node (as in: one running the actual Tor2web software) is to check whether it offers OpenData statistics by trying to access https://onion.tld/antanistaticmap/stats/yesterday. A "real" node should report statistics about accessed onion services; a "fake" one should give a 404 error.
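This check is easy to automate. A small sketch using only the standard library (onion.tld is again a placeholder domain; the opener parameter just makes the function testable offline):

```python
from urllib.error import HTTPError, URLError
from urllib.request import urlopen

def is_real_tor2web(proxy_domain, opener=urlopen):
    """Probe a proxy for Tor2web's OpenData statistics endpoint."""
    url = "https://%s/antanistaticmap/stats/yesterday" % proxy_domain
    try:
        with opener(url, timeout=10) as resp:
            # The real Tor2web software serves statistics here (HTTP 200)
            return resp.status == 200
    except HTTPError:   # a "fake" proxy answers 404
        return False
    except URLError:    # or is unreachable entirely
        return False

# is_real_tor2web("onion.tld")  # needs network access
```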

As you probably already expected: , and are real; the other four are fake.

How I think it should be

I think that the other proxies should take as an example: it got the best results in my comparison. In my opinion, it would be really nice if these operators came together and agreed on some standards.

At the very least, every proxy should send the X-Tor2web header. There must be a way to redirect Tor2web users or to lock them out, and this header is absolutely necessary for that. It would be even better if they all shared a common blacklist of blocked onion services. Currently, they each have their own, and they only block illegal services, not those that do not wish to work with Tor2web. One could also think of an HTTP header that the onion web server sends to tell the proxy that it should not allow the request. Or perhaps something like a robots.txt file, but for Tor2web blocking or redirecting.
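To make the robots.txt-style idea concrete, here is an entirely hypothetical sketch: before proxying, a Tor2web node would fetch a policy file (I'll call it tor2web.txt; neither the name nor the format is any existing standard) from the onion service and honour an opt-out rule:

```python
def tor2web_allowed(policy_text):
    """Parse a hypothetical tor2web.txt policy fetched from the onion
    service; return False if it disallows Tor2web proxying."""
    for line in policy_text.splitlines():
        if line.strip().lower() == "tor2web: disallow":
            return False
    return True

print(tor2web_allowed("Tor2web: disallow"))  # False
```

The point is not this particular format but that the opt-out decision would lie with the onion service operator instead of each proxy's private blocklist.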

Because is competing with my real website in search engine results, I actually tried reporting my own site there, with an explanation of why I am upset about and the fact that it is virtually undetectable. I haven’t heard from its provider since, and my site is not on the blocklist. I haven’t tried uploading underage pornography to my site and then reporting it. Although it might work, I highly discourage anyone from doing so.

Tracking ads like those of or especially are totally unacceptable to me, especially when the ads’ cookies are sent to the onion service, which can then track the user as well. The same is true for’s Cloudflare cookies and’s _PSI cookie. Although Tor2web is not very anonymous, there’s no reason to make it any worse than necessary, which is why I also dislike the X-Real-IP header.

Taking , easily the worst of all the proxies, into my comparison might not have been entirely fair. It isn’t mentioned anywhere in the GlobaLeaks docs and is run by some guy who has no respect for people’s privacy. But is mentioned. And that’s bad enough already. , , and aren’t mentioned either, from what I can tell. It might be a good idea to directly warn about providers like in the Tor2web documentation.

In conclusion, I really am a supporter of the general Tor2web idea; I’m just upset about its current poor landscape. I wish the topics of privacy and security were taken more seriously.