[Noisebridge-discuss] archiving a password protected website

John Adams jna at retina.net
Wed May 30 19:09:53 UTC 2012

I'm not sure why you would want to go through the trouble of running a
proxy for this.

wget has already solved this problem and can handle authentication:

wget -mk http://foo.com        # -m = --mirror, -k = --convert-links

You can also dump the cookies from your existing browser session and do:

          wget --no-cookies --header "Cookie: name=value" http://foo.com

...or use wget --load-cookies with a cookie file.
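A minimal sketch of the --load-cookies route, assuming a Netscape-format
cookies.txt exported from your browser (the domain, cookie name, and value
below are placeholders):

```shell
# Write a Netscape-format cookie file; a real one is exported from the
# browser (e.g. with a "cookies.txt" export extension). Fields are
# tab-separated: domain, include-subdomains, path, secure, expiry, name, value.
printf '# Netscape HTTP Cookie File\n' > cookies.txt
printf '.foo.com\tTRUE\t/\tFALSE\t0\tsessionid\tabc123\n' >> cookies.txt

# Then mirror the site with that session (placeholder URL):
#   wget -mk --load-cookies cookies.txt http://foo.com

# Sanity-check the cookie line: 7 tab-separated fields, name=value at the end.
awk -F'\t' 'NR==2 { print NF " fields: " $6 "=" $7 }' cookies.txt
```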

If it's basic auth, set the Authorization: header.
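For basic auth, wget can send the credentials itself, or you can build the
header by hand; the Authorization value is just base64 of "user:password".
A sketch with placeholder credentials and URL:

```shell
# wget builds the header for you:
#   wget -mk --user=alice --password=secret http://foo.com

# Or construct the Authorization: header manually; its value is simply
# "Basic " followed by base64("user:password").
token=$(printf 'alice:secret' | base64)
echo "Authorization: Basic $token"
#   wget -mk --header="Authorization: Basic $token" http://foo.com
```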


On Wed, May 30, 2012 at 11:41 AM, David Rorex <drorex at gmail.com> wrote:

> On Sun, May 13, 2012 at 11:52 PM, Andy Isaacson <adi at hexapodia.org> wrote:
>> There Really Should Be A Tool (tm) that operates as a HTTP proxy and
>> simply writes out every HTTP resource that you retrieve to the
>> filesystem.  I don't know of one, alas.  You can build one using
>> mitmproxy or Perl's proxy module I suspect, but that sounds like work...
>> -andy
> http://www.charlesproxy.com/ does exactly this: if you turn on the
> 'Mirror' feature, any webpage and all of its resources will be downloaded
> to a local copy, in a directory structure matching the URLs you visit. I
> bought it and use it for development, but I believe the trial version lets
> you use all features after a 10-second nag screen, with a 30-minute time
> limit (you can easily restart it after the 30 minutes are up).
> _______________________________________________
> Noisebridge-discuss mailing list
> Noisebridge-discuss at lists.noisebridge.net
> https://www.noisebridge.net/mailman/listinfo/noisebridge-discuss