Over on the tor-talk mailing list, Alec Muffett outlines the initial proof-of-concept for the Facebook .onion site:
Hi,
It rather depends upon the complexity of your “cleartext” website.
If you have a small, simple website with no cookies and where all of the
URLs are “relative” then maybe you could set up a Tor daemon and have the
hidden service point at your usual webserver, but there would probably
be errors and without extensive testing/fixing the results are very likely
to be flaky.
For any complex website it would be worse.
We did our initial proof-of-concept using “mitmproxy” such that a Tor
daemon hosting a hidden service spoke to mitmproxy via localhost:443; this
terminated SSL and then a selection of commandline arguments rewrote the
content bidirectionally, such that:
- incoming request headers - Host/Referer/etc… - which referenced the
onion address were rewritten in terms of the normal web address
- outgoing json/javascript/css/html in the response body which referenced
the normal web address was rewritten in terms of the onion address
- outgoing response headers which referenced the normal web address were
rewritten in terms of the onion address
- outgoing cookies had their domains changed in terms of the onion address
- caching was disabled, because debugging.
…and then the traffic was forwarded to the normal web address over a
new HTTPS connection.
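The core of the scheme is just bidirectional hostname rewriting at the proxy. As a rough illustration (not Facebook's actual code; the function names are mine, and I've used Facebook's well-known onion address for the example), the per-message transforms described above look something like:

```python
import re

ONION = "facebookcorewwwi.onion"  # Facebook's published onion address
CLEAR = "www.facebook.com"

def rewrite_request_headers(headers: dict) -> dict:
    """Map onion hostnames in inbound headers (Host, Referer, etc.)
    back to the clearnet name before forwarding upstream."""
    return {k: v.replace(ONION, CLEAR) for k, v in headers.items()}

def rewrite_response_body(body: str) -> str:
    """Map clearnet references in outgoing JSON/JS/CSS/HTML
    to the onion name."""
    return body.replace(CLEAR, ONION)

def rewrite_set_cookie(header_value: str) -> str:
    """Re-scope a Set-Cookie header's Domain attribute to the onion address."""
    return re.sub(r"(?i)(domain=)[^;]*facebook\.com",
                  r"\1." + ONION, header_value)

def disable_caching(headers: dict) -> dict:
    """Strip caching headers and force no-store -- 'because debugging'."""
    out = {k: v for k, v in headers.items()
           if k.lower() not in ("cache-control", "expires", "etag")}
    out["Cache-Control"] = "no-store"
    return out
```

Naive string replacement like this is exactly why the result was “somewhat clunky”: it can't catch hostnames that are escaped, split across chunks, or embedded in compiled assets, which is why a production solution needed changes in the site code itself.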
Overall the proof-of-concept was a single shellscript containing
approximately a single commandline with perhaps ten options.
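The email doesn't reproduce that commandline, and mitmproxy's option names have changed since 2014, but a present-day `mitmdump` invocation in the same spirit might look like this (flags and filter syntax per current mitmproxy; the hostnames and port are illustrative, not the actual command):

```shell
# Reverse-proxy a local port (which the Tor hidden service points at)
# to the clearnet site, rewriting hostnames in both directions.
# ~q matches requests, ~s matches responses; --anticache strips
# headers that would let the server answer 304 Not Modified.
mitmdump --mode reverse:https://www.facebook.com \
         --listen-port 8443 \
         --modify-headers '/~q/Host/www.facebook.com' \
         --modify-body '/~q/facebookcorewwwi.onion/www.facebook.com' \
         --modify-body '/~s/www.facebook.com/facebookcorewwwi.onion' \
         --anticache
```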
Most of the site functioned over the onion address via this mechanism, if
somewhat clunkily.
This experiment worked sufficiently well for us to see what modifications
needed to be done to the main website code to make a solution that was fit
for production.
https://lists.torproject.org/pipermail/tor-talk/2014-November/035680.html