boxcutter / windows

Virtual machine templates for Windows written in legacy JSON and Batch Scripting/JScript
Apache License 2.0

wget ignoring certificates #177

Closed dragetd closed 4 years ago

dragetd commented 5 years ago

I just realized that my PR #176 is not really helping, as most things are still downloaded while ignoring the certificate:

https://github.com/boxcutter/windows/blob/72c70ff96bc4a1fe1f2b6802c8d8839230f1450b/floppy/_download.cmd#L51

https://github.com/boxcutter/windows/blob/72c70ff96bc4a1fe1f2b6802c8d8839230f1450b/.windows/provisions/Makefile#L102

Downloads should not ignore the certificate. :-/

daxgames commented 5 years ago

@dragetd Are you a member of this project or just someone like me that uses it and wants to make it better?

You could add to the Makefile an option called SECURE_DOWNLOAD that defaults to 0 so the existing behavior is not changed.

People can then enable it if they want to.

Command line:

make SECURE_DOWNLOAD=1 virtualbox[target]

Makefile.local

SECURE_DOWNLOAD := 1

Then edit _download.cmd so it only injects --no-check-certificate when insecure downloads are requested:

rem Default to verifying certificates; only add the insecure flag when
rem SECURE_DOWNLOAD is 0, preserving the current behavior.
set "WGET_SECURITY="
if "%SECURE_DOWNLOAD%" == "0" (
    set "WGET_SECURITY=--no-check-certificate"
)

"%wget%" %WGET_SECURITY% %WGET_OPTS% -O "%filename%" "%url%"

This would also require editing all [target].json files in the root to configure this new env var.

Look for how CM is set/used in the Makefile and how it is used (both upper and lower case) in the *.json files.
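For illustration, the Makefile side could look roughly like this. Only SECURE_DOWNLOAD comes from the proposal above; the PACKER_VARS name and the exact plumbing are hypothetical and would really need to mirror how CM flows from the Makefile into the *.json templates:

```makefile
# Hypothetical sketch -- variable names besides SECURE_DOWNLOAD are
# illustrative, not the actual boxcutter Makefile contents.
SECURE_DOWNLOAD ?= 0

# Forward the flag to Packer as a user variable so the *.json templates
# can hand it to _download.cmd as an environment variable.
PACKER_VARS += -var 'secure_download=$(SECURE_DOWNLOAD)'
```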

dragetd commented 5 years ago

I am just a user, trying to improve the existing projects. I even considered starting from scratch - https://github.com/dragetd/packerwin

But not sure when I will find time for it.

I tried rewriting all URLs to their correct https counterparts in #176, but it broke things in some places, and along the way I realized how many hardcoded versions and repeated URLs there are. =/

Checking certificates should be the default, with an optional way to disable it if needed. But even then, what would be the use case? If it is for some proxy, its CA can be trusted locally.

arizvisa commented 4 years ago

Hey guys, I was just invited to be a maintainer and plan on improving a number of things in this once I get through some of the PRs that have been lagging.

But in my personal fork of this project, I've been self-hosting some of these binaries, which you can serve directly through Packer's HTTP server. Aside from this project not checking certificates when downloading binaries (which is of course a bad thing), I personally think it's worse, to begin with, that we're trusting some random host to download and execute binaries from.

That way any one of us contributors can update the binary, or better yet, include the binary's source as a git submodule so that we can guarantee trust by allowing anyone to rebuild the binaries from source if they want to.

Also, at some point I'd like users to be able to build templates on an air-gapped network by tweaking options if desired as some of us folk in infosec are forced into those types of environments.

How do you guys feel about these things? or would you simply just prefer including the CAs and cross-checking against them w/ wget ?
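For reference, serving binaries through Packer's built-in HTTP server only takes an `http_directory` setting on the builder; the guest can then fetch files from the host during provisioning (the `{{ .HTTPIP }}`/`{{ .HTTPPort }}` template variables expose the address, e.g. in `boot_command`). A minimal sketch, with the builder type and directory name purely illustrative:

```json
{
  "builders": [
    {
      "type": "virtualbox-iso",
      "http_directory": "provision-bin"
    }
  ]
}
```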

dragetd commented 4 years ago

Hey, great to hear someone is having a take on this!

So, you plan on putting the binaries into the repository? That is not exactly a good idea, as it will litter and bloat the repository massively. Unless I got the idea wrong.

If we say we do not trust the local CA and want offline support, I'd rather prefer a default download directory with signatures/hashes kept as part of the repository: either the file is downloaded, or, if it is already present, it is just verified against the signature.

This would combine both advantages. :)
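The download-or-verify idea could be sketched like this (shown in POSIX shell for clarity; the `fetch` helper and the idea of pinning SHA-256 hashes in the repo are illustrative, not something boxcutter provides today):

```shell
#!/bin/sh
# Sketch: hashes live in the repository, and a cached file is only
# re-fetched when missing; either way it must match the pinned hash.
fetch() {
    url=$1 file=$2 expected=$3
    # Download only if we don't already have a local copy.
    if [ ! -f "$file" ]; then
        wget -O "$file" "$url" || return 1
    fi
    # Verify the (possibly cached) file against the pinned hash.
    actual=$(sha256sum "$file" | cut -d' ' -f1)
    if [ "$actual" != "$expected" ]; then
        echo "checksum mismatch for $file" >&2
        return 1
    fi
}
```

This keeps offline builds working (the cached file is verified, never re-downloaded) while still failing loudly on tampered or stale binaries.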

As I feared back then, I lack the time to really get into it. But I'd still love to hear your ideas and see how they line up with what I had in mind. Maybe you got some time to chat some day.

arizvisa commented 4 years ago

Yeah, not everything though. Just the core required tools like wget.exe/curl.exe for bootstrapping. I have an msvcrt.dll-linked curl.exe that I use across the Windows environments which is only 488k. I mainly don't like that none of us can do anything about that required download of wget.exe from a host none of us control. Also, 7zip doesn't exactly need to be installed, as you can literally extract the statically linked binary and use that temporarily.

Despite these suggestions, I don't want to require users to have to build things so that the only default requirement is Packer, as in some cases they might not have GNU Make. But I would still like to expose that option of building to them if they prefer to, as I tend to work in isolated network environments.

Again tho, this is all up for discussion. After getting through some of the older PRs and closing the outdated or expired issues, I plan on writing up an issue and taking a break for a moment so that the community that still exists for this project can discuss its future or desired capabilities so we can figure out how far to take it.

I asked the maintainers if they needed help with this project, as most of the original maintainers have moved on to development in the windows-ps repository. They allowed me to help, so I'm attempting to revive some of the goals without going too far out of scope from the original design.

My interest in this project is that I maintain a fork that does a number of the things that I've previously mentioned, and hence I'm more interested in improving the templates themselves and the initial provisioning that they do.

arizvisa commented 4 years ago

Lol. and actually speaking of which, I encountered the following while testing something else.

==> vmware-iso: This version of C:\Windows\wget.exe is not compatible with the version of Windows you're running. Check your computer's system information to see whether you need a x86 (32-bit) or x64 (64-bit) version of the program, and then contact the software publisher.

daxgames commented 4 years ago

Yeah I ran into that too.

dragetd commented 4 years ago

I think redesigning the bootstrapping of the tools is a general task that would also fix this issue… so for the sake of cleaning up, I'll close this issue (please reopen if you want).