Open balteravishay opened 1 year ago
Rust has unsafe blocks, which bypass many of the compiler's safety checks (though not all of them). The Rustonomicon is an excellent resource.
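A minimal sketch of what an unsafe block permits (the variable names are just illustrative): dereferencing a raw pointer is one of the operations the compiler only allows inside unsafe, because it can no longer prove the pointer is valid.

```rust
fn main() {
    let x: i32 = 42;
    let p = &x as *const i32;

    // Dereferencing a raw pointer is rejected in safe Rust; inside an
    // `unsafe` block the compiler accepts it, and the burden of proving
    // that `p` points to valid, live memory shifts to the programmer.
    let y = unsafe { *p };
    assert_eq!(y, 42);
}
```

Note that other checks (e.g., type checking, the borrow checker on safe references) still apply inside the block; unsafe only unlocks a small set of extra operations.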
Many memory-safe languages have some sort of escape clause (e.g., Rust's unsafe, Ada's similar pragma, and the ability in many languages to call into C or C++ code). I guess the question is, "is it worth it?"
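As a sketch of the "call into C" escape clause: in Rust, a foreign function declared in an extern block must be called inside unsafe, because the compiler cannot verify anything about the C side. This example declares abs from the C standard library, which Rust programs already link against on most platforms.

```rust
// Declaring a C function: the compiler trusts this signature blindly,
// so calling it is an `unsafe` operation.
extern "C" {
    fn abs(input: i32) -> i32;
}

fn main() {
    // SAFETY: `abs` from libc matches the declared signature and has no
    // memory-safety preconditions for this input.
    let r = unsafe { abs(-7) };
    assert_eq!(r, 7);
}
```

Every other memory-safe language with a C FFI (Go's cgo, Java's JNI, Python's ctypes, etc.) has an analogous hole in its guarantees at that boundary.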
It's possible to write a large program with these safeties disabled throughout, but that's not the intended use. Typically, unsafe code is an extremely small portion of a codebase, so it can be reviewed closely by many people to ensure it's sound. It might be worth doing anyway, but it's not entirely clear. I don't know how to answer that question, but that might be a good place to start.
You might want to sample some unsafe code fragments to see whether they're where the risks are, and which risks matter. That might help guide your investigation. I could imagine the main outcome being a compiler warning emitted when a module or program exceeds some threshold of unsafe code.
Regarding sampling unsafe code fragments (the Rustonomicon makes this point too, but it's worth calling out explicitly): the safety boundary for unsafe code in Rust is the module boundary, due to Rust's privacy rules. You provide a safe API by hiding the relevant unsafe calls, or unsafe data such as raw pointers, behind private items. Examining snippets of unsafe code in Rust therefore generally means assessing the entire module in which the unsafe code appears, since any code in that module can access the private data and modify it in ways that affect safety.
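A minimal sketch of the module-boundary point (the type and field names are hypothetical): the unsafe code in current relies on an invariant over a private field, so a bug in any function in the module, even one containing no unsafe at all, could make the unsafe code unsound.

```rust
mod buffer {
    pub struct Buffer {
        data: Vec<u8>,
        // Module-wide invariant: `cursor < data.len()`. Every function
        // in this module must preserve it, because the unsafe code in
        // `current` depends on it.
        cursor: usize,
    }

    impl Buffer {
        pub fn new(data: Vec<u8>) -> Self {
            assert!(!data.is_empty(), "Buffer requires non-empty data");
            Buffer { data, cursor: 0 }
        }

        // Contains no `unsafe`, yet a bug here (e.g., dropping the
        // modulo) would break the invariant and make `current` UB.
        pub fn advance(&mut self) {
            self.cursor = (self.cursor + 1) % self.data.len();
        }

        pub fn current(&self) -> u8 {
            // SAFETY: `cursor < data.len()` is upheld by `new` and
            // `advance`. Code outside this module cannot violate it,
            // because `cursor` is private -- which is exactly why the
            // whole module, not just this line, must be reviewed.
            unsafe { *self.data.get_unchecked(self.cursor) }
        }
    }
}

fn main() {
    let mut b = buffer::Buffer::new(vec![10, 20, 30]);
    assert_eq!(b.current(), 10);
    b.advance();
    assert_eq!(b.current(), 20);
}
```

Privacy is what shrinks the review surface from "the whole program" down to "this module": no external code can set cursor, so the invariant only needs auditing within the module.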
The purpose of this initiative is to explore whether supplementary guides for other programming languages are needed to accompany the C/C++ Binary Hardening guide.
Our intention is to examine compilers for memory-safe-by-default languages (including Rust, Go, Java, Python, .NET, and more) and determine whether it's possible to compile code in such a way that the resulting binary exhibits memory-unsafe or undefined behavior.
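As one concrete illustration of a compiler setting changing runtime behavior in a memory-safe-by-default language (this is an assumption about the kind of option such a guide would survey, not a statement of its scope): in Rust, plain integer addition panics on overflow when overflow checks are on (the default -C overflow-checks setting in debug builds) and wraps when they are off (the release default). Wrapping is defined behavior in Rust, not UB, but it shows how build flags can silently change what a "safe" binary does.

```rust
fn main() {
    let x: u8 = 255;

    // The explicit APIs below behave identically under either compiler
    // setting; only plain `x + 1` differs (panic vs. wrap) depending
    // on `-C overflow-checks`.
    assert_eq!(x.wrapping_add(1), 0);    // defined two's-complement wrap
    assert_eq!(x.checked_add(1), None);  // overflow surfaced as None
}
```

A hardening guide for such languages would presumably enumerate flags like this one, alongside any options that can actually reintroduce memory unsafety.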