wxWidgets / Phoenix

wxPython's Project Phoenix. A new implementation of wxPython, better, stronger, faster than he was before.
http://wxpython.org/

MSVC build environment. #2085

Closed kdschlosser closed 11 months ago

kdschlosser commented 2 years ago

Using setuptools or distutils to set up an MSVC build environment has been flaky at best. It doesn't always work correctly because of Microsoft's inability to stick to a single kind of installation ecosystem. Starting with Visual Studio 2017, Microsoft added a series of COM interfaces to Visual Studio that allow all Visual Studio installations to be enumerated programmatically. Combined with registry keys and known locations on the system drive, this lets a proper build environment be set up without using any of the vcvars*.bat files. Those bat files do not always work properly and should not be used to set up the build environment. They are what setuptools and distutils use to create the build environment, and that is the reason it does not work 100% of the time.

wxPython can now be built on a Windows 7 computer running Visual Studio 2019+, whereas before it was limited to Windows 10+ computers. This was because the build used bash.exe, which is a part of WSL (Windows Subsystem for Linux), and WSL is only available on Windows 10+. There is another way... Seeing as most users who would want to build wxPython on a Windows computer probably have Git installed, why not use the bash.exe that ships with it? So I added code that detects whether Git is installed, collects the installation path from the registry, and uses that bash.exe if WSL is not available.
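A minimal sketch of that fallback logic, assuming Git for Windows records its install location under the `SOFTWARE\GitForWindows` registry key (`InstallPath` value); on non-Windows systems this just reports whatever `bash` is already on PATH:

```python
import shutil
import sys

def find_bash():
    """Locate a usable bash.exe for the build (hedged sketch).

    Prefers whatever bash is already on PATH (WSL on Windows 10+);
    falls back to the copy shipped with Git for Windows, located
    via its registry key. Returns None when neither is available.
    """
    bash = shutil.which("bash")
    if bash:
        return bash
    if sys.platform != "win32":
        return None
    import winreg  # Windows-only stdlib module
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                            r"SOFTWARE\GitForWindows") as key:
            install_path, _ = winreg.QueryValueEx(key, "InstallPath")
        return install_path + r"\bin\bash.exe"
    except OSError:
        return None
```

The registry key name here is an assumption about the Git for Windows installer; the general shape (PATH lookup first, registry fallback second) is the point.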

kdschlosser commented 2 years ago

Using the /MP cl.exe switch does not do what most people think it does. Technically speaking, it does not run CL.exe in parallel; it compiles solutions in parallel. This is bad in most cases because memory consumption ends up being enormous and there can be dependency problems.

What if I told you I could compile a single solution using, say, a Threadripper 16-core processor, with all 32 threads (logical CPUs) used while compiling that single solution? Far less RAM gets used and it is faster than compiling solutions in parallel. It also solves the possible issue of a build failing because a solution gets compiled while another solution it depends on has not finished compiling. Only one solution at a time would be compiled, so dependencies become a non-issue.

This would work cross platform as well and would not be just for Windows.
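The scheme described above (one solution at a time, its files compiled across all logical CPUs) can be sketched like this; `compile_one` is a hypothetical stand-in for spawning cl/gcc/clang on a single file:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def compile_solution(sources, compile_one):
    """Compile every source file of a single solution in parallel,
    one compiler invocation per logical CPU (hedged sketch).

    Because solutions are still built one at a time, inter-solution
    dependencies never race; only files within one solution overlap,
    which also keeps peak RAM near one solution's working set.
    """
    workers = os.cpu_count() or 1
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map preserves source order, so the object list
        # lines up with the input list for the link step.
        return list(pool.map(compile_one, sources))

objs = compile_solution(["a.cpp", "b.cpp", "c.cpp"],
                        lambda src: src.rsplit(".", 1)[0] + ".obj")
```

Each worker only waits on its own compiler process, so threads (rather than processes) are enough to keep all logical CPUs busy.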

marekr commented 2 years ago

I use KiCad for my PCB designs. I thought you guys used Qt for the GUI framework.

You can use Python to compile your entire program. I never checked whether KiCad is cross-platform or not; if it is, it can still be compiled using Python. I have a framework I wrote that does a really good job of locating compilers on different operating systems.

I did know KiCad has Python scripting support but never investigated how much KiCad relies on Python. I am really curious, probably because I use KiCad. If you want to have a more in-depth conversation about build systems, or if you want to know anything about this MSVC script I wrote, let me know. Tell me where to go to have the conversation and I would be more than happy to.

Nope, we can't use Qt, for legacy reasons, mainly its many hard dependencies. Long, long term, maybe with years of man-hours, we could move to Qt, but at the same time we are currently happy being able to maintain wxWidgets. Qt in comparison would be a nightmare; in my personal experience over the last 10 years, I'd rather headdesk myself than try to upstream another one-line bugfix to Qt.

Python's build support is funky. As is, if MSVC updates ahead of hardcoded expectations in Python's build tooling, it will basically refuse to compile.

This is why we use CMake: all those problems are handled, and with vcpkg, maintained by Microsoft, we have the entire cross-platform ecosystem handled for C/C++ libraries. Python is irrelevant, slow, and a maintenance headache for that purpose. Writing a custom Python toolchain is great and all as an exercise, but the universal world standard for C/C++ is CMake.

But that's not the topic and reason I commented here.

Using the /MP cl.exe switch does not do what most people think it does. Technically speaking, it does not run CL.exe in parallel; it compiles solutions in parallel. This is bad in most cases because memory consumption ends up being enormous and there can be dependency problems.

I think you misunderstand entirely what I was commenting on. Somebody mentioned "simplifying and improving the build system".

I brought up the fact that wxPython can be thought of as two steps: "preprocessing", which needs Python, and the actual "build", which does not.

With that logic, the build chain can be simplified quite a bit with the usage of CMake.

Now on the /MP topic, when CMake does parallelization for MSVC, it does not use /MP. It uses Ninja. Even Microsoft officially uses Ninja under CMake by default now.

/MP and MSVC msbuild/slns/cprojs are ancient history ;)
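As a rough illustration of that route (assuming `cmake` and `ninja` are on PATH, and `wxWidgets` as a placeholder source directory), configuring and building with the Ninja generator looks like:

```shell
# Configure with the Ninja generator instead of msbuild/.sln files
cmake -S wxWidgets -B build -G Ninja -DCMAKE_BUILD_TYPE=Release

# Ninja schedules individual translation units across all logical CPUs
cmake --build build --parallel
```

No /MP flag is involved: Ninja itself decides how many compiler processes to run at once.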

kdschlosser commented 1 year ago

I agree with you as far as CMake and Ninja go. Unfortunately, wxWidgets is set up to compile using nmake, and I don't believe the CMake files have been kept up to date. I have tried to get it to compile properly using CMake and Ninja and have not had any success in doing so.

marekr commented 1 year ago

I agree with you as far as CMake and Ninja go. Unfortunately, wxWidgets is set up to compile using nmake, and I don't believe the CMake files have been kept up to date

wx 3.1.4 and up have CMake support, and it is actively maintained. nmake is not used (and I don't remember it ever being used)

kdschlosser commented 1 year ago

nmake is what gets used when compiling wxWidgets within wxPython. If I change to using CMake, the compile fails.

RobinD42 commented 1 year ago

This pull request has been mentioned on Discuss wxPython. There might be relevant details there:

https://discuss.wxpython.org/t/building-phoenix-on-windows/36435/4

kdschlosser commented 1 year ago

I wanted to make a note of something. This "bug" in setuptools will NEVER be fixed because, in their eyes, it is not a bug. You have to read the "fine print" with setuptools. Their documentation just plain stinks, so one has to resort to reading its source code. That is what I actually did recently, and I stumbled upon this:

Improved support for Microsoft Visual C++ compilers.
Known supported compilers:
--------------------------
Microsoft Visual C++ 14.X:
    Microsoft Visual C++ Build Tools 2015 (x86, x64, arm)
    Microsoft Visual Studio Build Tools 2017 (x86, x64, arm, arm64)
    Microsoft Visual Studio Build Tools 2019 (x86, x64, arm, arm64)
This may also support compilers shipped with compatible Visual Studio versions.

Read that carefully. It states that setuptools DOES NOT support Visual Studio or Visual C++; it only supports Build Tools. Build Tools is a different animal than Visual Studio, and because setuptools relies on hard-coded paths for both the registry and the filesystem, it is not going to be able to locate Visual Studio properly, especially the new versions.

Using the vswhere that comes with Visual Studio is not a viable solution in my opinion because, once again, it relies on hard-coded paths to locate where vswhere is installed in order to run it. If you are able to locate vswhere, then you have already found the Visual Studio installation, which in and of itself makes it pointless to use.

If the decision were made to only allow VS 2017 or newer, quite a bit of code could be removed from the script I wrote.

I also wanted to note that I published the code to PyPI as pyMSVC. It can be added to the build system through the pyproject.toml file, and it will be downloaded when a Windows build is being done. This means the build system that is currently in place needs a makeover so that it is run from setup.py directly and not through a subprocess.
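For illustration, wiring a Windows-only build requirement into pyproject.toml might look like the following (the environment marker is standard PEP 508 syntax; the exact requirement spelling is an assumption):

```toml
[build-system]
requires = [
    "setuptools",
    # pyMSVC is only needed (and only useful) on Windows
    "pyMSVC; sys_platform == 'win32'",
]
build-backend = "setuptools.build_meta"
```

With this in place, pip's build isolation fetches the package automatically before running the build backend on Windows.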

Me personally, I will avoid using subprocess as much as possible. The output of the process being run may change, and/or errors are not handled properly inside the subprocess, which can cause issues. On Windows systems you have orphaned-process issues on top of that. Using subprocess has caused me some headaches in the past: if the program freezes or crashes, the subprocess can end up detached from the main process. This has caused hung network ports that could only be released by rebooting the computer.

It is also no mystery that the majority of the developers of Python and its build systems are mostly Linux folks. Getting the Windows side of things up to snuff is not high on their priority list. There are bugs in the socket module that have been there for over a decade, so they don't exactly rush to fix Windows-based issues; a fantastic example is the locale module, which still doesn't work properly.

My script still works, with zero bug reports in the last year. It is being used by people who compile Cython code; there is a link to it in their documentation. So it's not a question of reliability or whether the code is sound. The question of maintainability pretty much went out the window as well: a full year on PyPI and not one thing has needed to be done to it in terms of fixing a bug or updating it to work with new versions of VS. I am going to update it to add Python 3.11 to the automatic compiler detection. The script does still work with Python 3.11; it just has to have the MSVC compiler versions passed to the setup function.

kdschlosser commented 1 year ago

I also wanted to mention the importance of using the exact same MSVC compiler version that was used to compile CPython when compiling extension modules. This needs to be done because changes to the C runtime code are not 100% compatible between MSVC versions.

3.6  = 14.0  - ?
3.7  = 14.14 - ?
3.8  = 14.14 - ?
3.9  = 14.14 - ?
3.10 = 14.14 - 14.29 ?
3.11 = 14.33

What is important to keep the same is the C runtime version.

https://en.wikipedia.org/wiki/Microsoft_Visual_C%2B%2B

when this code gets run:

import platform

print(platform.python_compiler())

the output is along the lines of (here on Python 3.8.10):

MSC v.1928 64 bit (AMD64)

It tells you the MSVC version used to compile. Here it is 1928, which uses the 14.28 C runtime. That means, to make sure compilation works properly for an extension, these VS versions need to be used: 16.8.1, 16.8.2, and 16.9.2, which are VS 2019. But not all VS 2019 versions use MSC 1928.
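The mapping from that banner to a VC++ runtime version can be automated; here is a hedged sketch (the function name is mine, and it only handles the MSC 19xx ↔ VC++ 14.xx correspondence described above):

```python
import re

def vc_runtime_from_banner(banner):
    """Map a platform.python_compiler() banner such as
    'MSC v.1928 64 bit (AMD64)' to the matching VC++ runtime
    version string, e.g. '14.28' (MSC 19xx <-> VC++ 14.xx).
    Returns None for non-MSVC builds (GCC/Clang banners)."""
    m = re.search(r"MSC v\.19(\d+)", banner)
    return None if m is None else "14." + m.group(1)

import platform
vc_runtime_from_banner(platform.python_compiler())  # None unless this build used MSVC
```

This makes it easy for a setup script to compare the runtime CPython was built with against the compiler it is about to use.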

The funny thing is how out of date information is for Python. Right in the Python documentation it states this:

For a concise step by step summary of building Python on Windows, you can read Victor Stinner’s guide.

https://devguide.python.org/getting-started/setup-building/#windows-compiling

It is pulling information from the Wayback Machine, so that tells you how up to date it is. But they specifically mention Victor Stinner, and it just so happens he has some information that is more up to date:

https://pythondev.readthedocs.io/windows.html#python-and-visual-studio-version-matrix

and even that is really far off. Look at what it says for Python 3.8: VS 2017. Yet here we have 3.8.10, which has been compiled using VS 2019.

While Python can be compiled using VS 2017 or newer, an extension module should be compiled using the exact same compiler that Python was compiled with. This is the only way to ensure 100% compatibility between the extension module and Python. setuptools is never going to be set up to do this properly, because the authors feel they have provided enough support by supporting only Build Tools versions 2015, 2017, and 2019, even though the binary release of Python is compiled using VS 2022.

marekr commented 1 year ago

This needs to be done because of changes to the C runtime code not being 100% compatible between MSVC versions.

This hasn't been true since MSVC 2015. https://devblogs.microsoft.com/cppblog/the-great-c-runtime-crt-refactoring/

MSVC 2015 and later use the UCRT (Universal C Runtime). Basically, in the past, vcruntime1XX was a thing; ever since 2015, vcruntime140 is the only one, and neither the name nor the version ever changes. Any "old" app built with MSVC 2015 will happily use a newer, 2023-updated version of the same vcruntime140.

kdschlosser commented 1 year ago

https://learn.microsoft.com/en-us/cpp/porting/binary-compat-2015-2017?view=msvc-170

You can mix binaries built by different versions of the v140, v141, v142, and v143 toolsets. However, you must link by using a toolset at least as recent as the most recent binary in your app.

If Python is compiled using Visual Studio 2022 (which it is), you CANNOT compile the extension modules using VS 2019; you MUST use VS 2022 or newer.

If Python was compiled using VS 2017 then you CAN use VS 2022 to compile the extension modules.

Backward compatibility exists, not forward. So the rule of thumb is: use the same or a newer version than what Python was compiled with. Python binaries are compiled using version 14.3 of the C runtime. The terms "build tools" and "toolsets" are used interchangeably and refer to the C runtime version.
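That "same or newer" rule is easy to enforce mechanically; a minimal sketch, with illustrative function and version names (note it assumes full dotted versions like "14.29", not abbreviations like "14.3"):

```python
def toolset_at_least(local, required):
    """Return True when the local MSVC toolset can link binaries
    built with 'required': the local version must be the same or
    newer, per Microsoft's binary-compatibility rule."""
    as_tuple = lambda v: tuple(int(p) for p in v.split("."))
    return as_tuple(local) >= as_tuple(required)

# CPython built with toolset 14.33 needs a 14.33-or-newer linker:
toolset_at_least("14.29", "14.33")  # False: the VS 2019 toolset is too old
toolset_at_least("14.34", "14.33")  # True: a newer VS 2022 toolset is fine
```

A build script could call this with the version parsed from platform.python_compiler() on one side and the detected local toolset on the other, and fail fast with a clear message instead of producing a subtly incompatible extension.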

https://wiki.python.org/moin/WindowsCompilers

At the time of this writing, CPython is built using VC++ 14.3 (Jan 2022).

14.3 is the first release of VS 2022, so only VS 2022 should be used to compile extension modules for Python 3.11. IDK about any of the other versions of Python, as I have not checked whether they are compiled using a different version of MSVC.