luxonis / depthai-core

DepthAI C++ Library
MIT License

unexpected use of CMAKE_TOOLCHAIN_FILE in cmake, recommend not using it #72

Open diablodale opened 3 years ago

diablodale commented 3 years ago

The current wipe of CMAKE_TOOLCHAIN_FILE in the project's CMakeLists.txt is unexpected. As coded today, its only action is to set(CMAKE_POSITION_INDEPENDENT_CODE ON) when BUILD_SHARED_LIBS=ON.

It is my understanding that CMAKE_TOOLCHAIN_FILE is for other uses. Per the cmake docs:

path to a file which is read early in the CMake run and which specifies locations for compilers and toolchain utilities, and other target platform and compiler related information

For example, I need to set CMAKE_TOOLCHAIN_FILE to my vcpkg installation so that this project can find that managed installation of OpenCV which is compiled for specific variant, flags, etc. This is done by specifying CMAKE_TOOLCHAIN_FILE on the command line as a flag to cmake (or in settings of an IDE that are passed to cmake).
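As an illustration of the conflict (paths below are hypothetical, not taken from this repo), a developer passes a toolchain on the command line, but a CMakeLists.txt that force-sets the cache variable silently discards that choice:

```cmake
# What a developer using vcpkg would run (example path):
#   cmake -S . -B build -D CMAKE_TOOLCHAIN_FILE=C:/vcpkg/scripts/buildsystems/vcpkg.cmake

# A project-side pattern like this overwrites the user's toolchain choice:
set(CMAKE_TOOLCHAIN_FILE "cmake/internal-toolchain.cmake" CACHE STRING "" FORCE)
```

With the FORCE'd cache write, the vcpkg toolchain (and therefore its find_package integration for OpenCV) never takes effect.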

Setup

Recommendation

Move set(CMAKE_POSITION_INDEPENDENT_CODE ON) to a place unrelated to CMAKE_TOOLCHAIN_FILE. In my view, it is unrelated to the cmake toolchain feature and instead a specific compiler/linker need of the depthai-core project like specifying the c++11 standards with no extensions, or pthread preference.
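A minimal sketch of that recommendation, assuming a typical top-level layout (names illustrative):

```cmake
cmake_minimum_required(VERSION 3.10)
project(depthai-core)

# Plain project-wide compile settings, kept out of any toolchain file,
# alongside other project-specific needs like the C++ standard:
set(CMAKE_CXX_STANDARD 11)
set(CMAKE_CXX_EXTENSIONS OFF)
if(BUILD_SHARED_LIBS)
    set(CMAKE_POSITION_INDEPENDENT_CODE ON) # moved here from the toolchain wipe
endif()
```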

Workaround

Always edit this project's CMakeLists.txt to remove this wipe of CMAKE_TOOLCHAIN_FILE and replicate the set(CMAKE_POSITION_INDEPENDENT_CODE ON) elsewhere if shared libraries are built.

themarpe commented 3 years ago

CMAKE_TOOLCHAIN_FILE is set as a default to specify building of position independent code for 3rdparty dependencies (Hunter).

If CMAKE_TOOLCHAIN_FILE is set manually using cmake -D CMAKE_TOOLCHAIN_FILE=..., it will be respected instead.

Your use case (as well as the others) would work as expected.

diablodale commented 3 years ago

Unfortunately, I believe the combination of Hunter and a dev-chosen CMAKE_TOOLCHAIN_FILE is incompatible. I cannot build this project without hacking this project's CMakeLists.txt.

It is all related to the toolchain, finding opencv, and opencv's dependencies. As is, the current project+Hunter setup doesn't work with Windows, Visual Studio, cmake, and a static OpenCV. The failures are...

Hunter requires HunterGate() before the project() command, yet most cmake statements will fail until after project(). HunterGate() goes and does things that cause TOOLCHAIN activity, and in the TOOLCHAIN are calls like set() and find_package(). But find_package() will fail because it runs before project() due to HunterGate(). cmake critically fails with "Error required internal CMake variable not set, cmake may not be built correctly. Missing variable is: CMAKE_FIND_LIBRARY_PREFIXES"
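A rough sketch of that ordering problem (HunterGate URL/SHA1 elided; file names hypothetical):

```cmake
cmake_minimum_required(VERSION 3.10)

include("cmake/HunterGate.cmake")
HunterGate(URL "..." SHA1 "...")  # must run before project(); reads CMAKE_TOOLCHAIN_FILE,
                                  # so a toolchain that calls find_library() executes here

project(depthai)                  # only after this point are CMAKE_FIND_LIBRARY_PREFIXES
                                  # and related internal variables initialized
```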

As it is today, I don't see a way for devs to have our own TOOLCHAIN file because of Hunter. The only workaround I see is to hack CMakeLists.txt after every pull from this repo. All the things that should be in the toolchain file have to be put into CMakeLists.txt after project().

Maybe there is a way to work around Hunter's limitations. Is there a way that CMakeLists.txt can find opencv, then iterate through all the imported dependencies and call find_package() on those? I don't know, and didn't find anything directly on that topic with google.🤷‍♀️

themarpe commented 3 years ago

@diablodale

WRT a toolchain including find_package and the like, I'm not sure I've seen that kind of usage before, so I wouldn't know.

Are you using the library in conjunction with another package manager / environment, or just building on its own?

diablodale commented 3 years ago

I wrote the OP in a generic sense: some developer somewhere wants to use their own toolchain. Naturally, I'm a subset. ;-) My very specific use related to this is:

So when this scenario now includes depthai, to use depthai with the above I need to compile the depthai libraries myself...

  1. compile depthai with my private compiled/installed OpenCV. I need to somewhere set OpenCV_DIR=xxx. This can be done on the cmake command line as a cmake define. depthai's CMakeLists.txt already does find_package(opencv) so all is good here.
  2. link to all dependencies of my private installed OpenCV. Remember, my opencv install is of static libraries. So all the libraries that opencv itself depends on now need to be linked. The minimum set given depthai's usage of opencv is Eigen and IPP. This is the problem area.

How can depthai's cmake infrastructure find and load the packages for Eigen and IPP? Typically this is done by writing the cmake code in a file and then include() that file or use CMAKE_TOOLCHAIN_FILE. And that code in the separate file will have set(), find_package(), find_library(), or other such cmake mechanisms to find/load those packages.

However, huntergate() forces itself to be before project(). And huntergate() causes TOOLCHAIN to load. But if commands like find_library() are called before project() they will critically fail.

This project's cmake infrastructure doesn't provide a mechanism to load an arbitrary file after project() and before commands that might use libraries or their dependencies. This really is describing TOOLCHAIN, but...hunter.

This leaves the only fix to be....manually hacking depthai's CMakeLists.txt to have my needed cmake calls. Here is exactly what I have working today:

...
# Set to export compile commands for tools like clang-tidy and format
set(CMAKE_EXPORT_COMPILE_COMMANDS ON)

#######################################################
set(OpenCV_STATIC ON) # because my opencv is a static opencv
if(CMAKE_BUILD_TYPE STREQUAL "Debug")
    set(OpenCV_DIR "C:/repos-nobackup/opencv/.install/Debug")
else()
    set(OpenCV_DIR "C:/repos-nobackup/opencv/.install/Release")
endif()

set(Eigen3_DIR "C:/njs/vcpkg/installed/x64-windows-static-md-v142-sdk10b17134/share/eigen3")
find_package(Eigen3 3.3.7 CONFIG REQUIRED)

list(APPEND CMAKE_MODULE_PATH "${CMAKE_SOURCE_DIR}/.vscode") # contains my custom FindIPP.cmake because there is no cmake standard find module for IPP
set(IPP_STATIC ON)
set(IPP_DIR "C:/Program Files (x86)/Intel/oneAPI/ipp/latest")
find_package(IPP 2021.1.0 REQUIRED) # COMPONENTS cc i iw)
#######################################################

# Add module path
...

I don't recommend you stop using Hunter. It is providing you substantial features you want.

Perhaps then, a fix could be to add a line of code to depthai's CMakeLists.txt: optionally load a cmake file if that file exists, with the file path defined by a cmake define, similar to the following. It would need to be after project() and before most other things, while at the same time allowing this custom include to override most behavior. Looking at the code, I would recommend placing it just before the find_package(git).

if(DEPTHAI_CUSTOM_INCLUDE)
    include("${DEPTHAI_CUSTOM_INCLUDE}") # guarded so an unset or empty define is a no-op
endif()

themarpe commented 3 years ago

@diablodale Are you compiling depthai as a static or shared library? For static, depthai shouldn't need to know about Eigen and IPP - it only requires headers (if they are public), which should be available after the find_package call with the given OpenCV_DIR. Even for a shared library, I think the same applies (although I'm not sure if opencv provides find_package calls to its dependencies?)

After you compile and install depthai, you are given a depthaiConfig.cmake which describes the needed dependencies, among others a find_package for opencv. In your application, you'll be required to set OpenCV_DIR again, as that is the needed dependency.

If I got your description correctly, you would set a TOOLCHAIN, which specifies the find_package for Eigen and IPP?

One thing:

And huntergate() causes TOOLCHAIN to load. But if commands like find_library() are called before project() they will critically fail.

In my quick test, huntergate() doesn't seem to invoke the toolchain. With a toolchain.cmake of:

message(FATAL_ERROR "Toolchain failure")

And configuring depthai-core as usual:

cmake -S . -B build -D CMAKE_TOOLCHAIN_FILE=toolchain.cmake

Returns an error with a stack trace originating from the project() call. I'm not sure what kind of behavior you have observed, though.

One idea that does come to mind, instead of tweaking depthai-core, is to create a complete OpenCVConfig.cmake:

include([path/to/opencv/install]/lib/cmake/opencv4/OpenCVConfig.cmake)
# Custom logic for finding 3rdparty requirements, Eigen, IPP

Then configure by setting this 'complete' OpenCVConfig.cmake instead of the installed one.
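In other words (directory names hypothetical), the wrapper would live in its own directory under the name OpenCVConfig.cmake, and the build would be pointed at that directory:

```cmake
# wrapper/OpenCVConfig.cmake -- a sketch, not the real installed file
include("C:/opencv-install/lib/cmake/opencv4/OpenCVConfig.cmake")

# Custom logic for finding 3rdparty requirements of a static OpenCV:
find_package(Eigen3 CONFIG REQUIRED)
find_package(IPP REQUIRED)

# Then configure with:
#   cmake -S . -B build -D OpenCV_DIR=C:/path/to/wrapper
```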

That way you give depthai-core everything it needs to compile against OpenCV that it didn't have before.

The thing with having a custom include is that it just feels out of place. If opencv defines its dependencies, then setting the DIR to those dependencies should work IMO. If it doesn't, then I'm not sure. I'm up for discussion on this one, but would like to see some usage of it (where it isn't just a workaround).

diablodale commented 3 years ago

Are you compiling depthai as a static or shared library?

depthai as static library.

For static, depthai shouldn't need to know about Eigen and IPP - it only requires headers (if they are public), which should be available after the find_package call with the given OpenCV_DIR. Even for a shared library, I think the same applies (although I'm not sure if opencv provides find_package calls to its dependencies?)

That is errant. You are collapsing two distinct things together. The following two are completely distinct:

I recommend you do some reading on this topic. It is the responsibility of the cmake project to find its dependencies. This is usually done with toolchain files, set(xxx_DIR), find_package(), etc. And this extends down the rabbit hole: it is the responsibility of depthai to find ALL the dependencies of EVERYTHING it needs to compile/link. OpenCV correctly declares its dependencies (Eigen3::Eigen and IPP). Now it is the job of the depthai cmake project to find those dependencies (Eigen and IPP) on the hard drive.

This is good and wanted behavior. The depthai project, or the dev compiling it (me), can specify exactly where dependencies are. OpenCV has no clue. OpenCV could have been downloaded from a ZIP file (very common). OpenCV doesn't know anyone's hard drive layout. It is the toolchain, set(), find_package(), etc. that does the mapping of dependency -> files on hard drive.

When a dependency is completely self-enclosed (like a DLL), then this 2nd-level dependency issue isn't exposed. When the dependency is declared with exactly the same name as the file that implements it, and there is a system-wide directory of all libraries like /usr/lib, then this 2nd-level dependency issue also isn't exposed, because cmake can find that exact filename in the system-wide /usr/lib directory using the well-documented set of default prefixes/suffixes and paths that it checks when trying to find files/libs/includes/etc. All that silent default fallback help doesn't work when dependencies are not single files with the exact same name, or there isn't a system-wide lib directory, etc.

You will find in reading on this topic that declared dependencies are not files; they are symbolic names. And these symbolic names can map to single or multiple files. So a declared dependency like Eigen3::Eigen might actually be a list of 23 library files. And it is the job of the depthai cmake project to find those 23 files.

Not sure what kind of behavior have you observed though.

Please reference what I wrote. I identified

  1. the exact error message: Error required internal CMake variable not set, cmake may not be built correctly. Missing variable is: CMAKE_FIND_LIBRARY_PREFIXES
  2. cause: find_library() before project()
  3. My TOOLCHAIN file is attempting to find IPP
  4. its related cause: huntergate() before project(). I'm not interested in debugging hunter and its misuse of TOOLCHAIN. Instead, I have a need to find dependencies and TOOLCHAIN is the place to do it. But since hunter misuses that, then I need another way to find dependencies.
  5. a workaround and a one line fix: include(mytoolchainfile.cmake OPTIONAL)

If there is some part of the above you don't understand, please ask me specifically and I'll try to make it more clear.

If I got your description correctly, you would set a TOOLCHAIN, which specifies the find_package for Eigen and IPP?

That's right. It is the responsibility of the depthai cmake project to find dependencies.

One idea that does come to mind, instead of tweaking depthai-core, is to create a complete OpenCVConfig.cmake:

That would never happen. As an almost-rule, xxxxConfig.cmake files are programmatically generated during config/compile/install of that project. These Config files live within the project installation itself. They are, in general, not hand made. A project like OpenCV is complex and I would never attempt to replicate the Config file OpenCV programmatically creates. Too complex. Too easy to make errors. And as a manpower evaluation...my one-line proposed fix to depthai's CMakeLists.txt is faster, simple, and 100% works.

This is in contrast to Findxxx.cmake files. Find modules are not the same as Config files. Find modules are hand created. They are created for projects that are not cmake outputs. IPP is one such project. Its team does not provide cmake support, and there is no Kitware-distributed IPP find module.

Luxonis-Brandon commented 3 years ago

Hi @diablodale ,

I recommend you do some reading on this topic.

Do you have some recommended reading around this?

Thanks, Brandon

diablodale commented 3 years ago

haha, the plot thickens. 🧐 We might have confluence of issues. Please do review the below. I would like to reach a consensus on the problem and its cause. Afterwards, we can work on a solution. I'm including @alalek from the OpenCV project as he has direct knowledge and I'm starting to think OpenCV owns part of a solution.

My learning on this topic has been via cmake official docs, random articles found with searches, and discussions in context of projects (e.g. github issues). As I was doing a google search for suggested reading, I ran across https://cmake.org/cmake/help/git-stage/guide/importing-exporting/index.html

All required dependencies of a package must also be found in the package configuration file.

Which is the opposite of what I previously learned and wrote above in this issue. At first, my thought was...what does "found" mean? Does it mean only checking the xxx_FOUND variable? Does it mean actively searching for files on the hard drive? etc. So I did more reading at that URL and at https://cmake.org/cmake/help/latest/module/FindPackageHandleStandardArgs.html
https://cmake.org/cmake/help/git-stage/guide/using-dependencies/index.html
https://cmake.org/cmake/help/latest/command/find_package.html
https://cmake.org/cmake/help/latest/command/target_link_libraries.html
https://pabloariasal.github.io/2018/02/19/its-time-to-do-cmake-right/

Then I referenced a discussion in OpenCV https://github.com/opencv/opencv/issues/19282 and found also https://github.com/opencv/opencv/issues/18521. The first was a trigger for my viewpoint above. The latter I found today. Combining all that then made me question my previous views formed by the OpenCV issue. I can be wrong, and I prefer to learn and correct. 🔬

As a test, I traced the scenario that it is OpenCV's job to actively search on hard drives for dependencies it needs itself. With a shared library build (e.g. Windows DLL), most everything needed is found at compile-time of OpenCV itself and everything collected and linked together in one gigantic DLL. All code needed to run OpenCV is within that one DLL.

In contrast, a static library build of OpenCV creates a Windows .LIB file. That .LIB file contains only the code of the OpenCV project itself. It does not contain the code of its dependencies (e.g. eigen, ceres, IPP, etc.). Generally for static libraries, the linker does not combine the code of the project with the code of the project's dependencies into one gigantic static LIB file. The linker, in downstream projects that use the static build of OpenCV, needs to link to the code for OpenCV's dependencies. And cmake provides mechanisms for this, e.g. target_link_libraries() with transitive dependencies.
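CMake's built-in mechanism for this is usage requirements on targets; a minimal sketch (target names illustrative):

```cmake
# A static library declares its own needs with visibility keywords...
add_library(mylib STATIC mylib.cpp)
target_link_libraries(mylib PUBLIC Eigen3::Eigen)

# ...and a consumer inherits them transitively: linking 'app' against 'mylib'
# also links Eigen's requirements, because static linking never merged that
# code into mylib.lib.
add_executable(app main.cpp)
target_link_libraries(app PRIVATE mylib)
```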

When these dependencies are known by the cmake infrastructure of OpenCV, then somehow that knowledge needs to be given to downstream projects that link to OpenCV. The built-in technology to do this is https://cmake.org/cmake/help/git-stage/guide/importing-exporting/index.html#creating-a-package-configuration-file

# OpenCVConfig.cmake file should contain...
include(CMakeFindDependencyMacro)
find_dependency(Eigen)
find_dependency(IPP)
find_dependency(ceres)
...

I searched the entire OpenCV source code from the 4.5 branch. There is no code that calls find_dependency(). I also searched an install of OpenCV. It also does not have any call to find_dependency(). Referencing the official cmake docs above...suggests to me that it is OpenCV's job to find all its dependencies in its OpenCVConfig.cmake file. I don't think OpenCV is doing what it needs to be doing. This aligns with the issue https://github.com/opencv/opencv/issues/18521 and underlies issue https://github.com/opencv/opencv/issues/19282

Perhaps then a confluence of issues contribute. Please do check each of these for correctness.

  1. There is no cmake config file or cmake find module provided by the IPP (Intel Performance Primitives) team. 🤦‍♀️
  2. There is no cmake find module for IPP written by Kitware and shipped with cmake
  3. There is no cmake find module for IPP written by OpenCV and used internally within its own project
  4. OpenCV does not find its own dependencies in its OpenCVConfig.cmake as per official cmake instructions
  5. Because OpenCV does not follow cmake instructions, this cascades the burden to find all upstream dependencies of OpenCV to all downstream projects using OpenCV
  6. Somewhat circular, now downstream projects (like depthai) have the challenge to write their own custom IPP find module that is compatible with OpenCV (it uses very specific symbolic dependency names), and write their own toolchains to find Eigen, ceres, glog, cuda, and all other OpenCV upstream dependencies.

There may also be issues within the depthai project or hunter, but these are later problems that might be side effects of the above six.

OK. Enough for today. What are all your viewpoints? 🙃

alalek commented 3 years ago

There is a strong recommendation to use shared builds instead. Redistribution of "static builds" is an over-complicated task for projects with external dependencies. The find_dependency() approach may work only as long as it provides the same results everywhere (you can play with find_dependency(LAPACK) on different user systems). In the much worse case, this automatic approach may silently find a different version of binaries which is not compatible with the dependency headers used in the compiled binaries. You would get linker errors, or crashes in runtime later.

Unfortunately there is no robust, easy-to-use solution for the "static builds" problem. In the OpenCV project we don't add anything for "static builds", so advanced users should configure dependencies themselves (with full responsibility) before calling find_package(OpenCV) on "static builds". Such configurations are specific to their use case only, their build environment, and the used options/dependencies.

themarpe commented 3 years ago

I agree with @alalek on this, as far as my limited insight into various different projects reaches. Static dependencies are usually not handled transitively. (Off topic: in our project we install to the build directory by default and install the static dependencies that were used. This enables the option of linking to a statically built depthai, and all dependencies will be resolved to the ones installed alongside. This isn't quite doable in the case where you want to install to a public location with public/user-controllable dependencies, but it's okay for an isolated case like ours IMO.)

Having find_dependency (find_package) calls in a *Config seems okay, but it would have to be done only for static builds; shared builds have dependencies already collapsed. On one hand it seems reasonable for a project to also specify the location of the dependencies it used for itself, but given that the main point of installation is redistribution, it doesn't make much sense (maybe if those were packaged together). That issue is then similar to the runtime question of every app being self-contained vs. dependent on other libraries outside itself - but at build time.
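A sketch of what such a static-only guard could look like in a generated *Config.cmake (the flag name is hypothetical):

```cmake
include(CMakeFindDependencyMacro)

if(NOT DEPTHAI_BUILT_SHARED)  # hypothetical flag recorded at install time
    # Static build: dependency code was not collapsed into the binary,
    # so consumers must be able to locate the dependencies at link time.
    find_dependency(Eigen3)
endif()
```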

@diablodale WRT HunterGate call, if you want you can create an MRE and I'll take a look, but its likely better to be addressed to Hunter project instead.

About the:

That would never happen. As an almost-rule, xxxxConfig.cmake files are programmatically generated during config/compile/install of that project. These Config files live within the project installation itself. They are, in general, not hand made. A project like OpenCV is complex and I would never attempt to replicate the Config file OpenCV programmatically creates. Too complex. Too easy to make errors. And as a manpower evaluation...my one-line proposed fix to depthai's CMakeLists.txt is faster, simple, and 100% works.

I meant not to recreate OpenCV's Config file but to "extend" it with dependencies. That'd mean one include line, and the rest are the same lines that you've posted above:

include(path/to/OpenCVConfig.cmake)

set(OpenCV_STATIC ON) # because my opencv is a static opencv
if(CMAKE_BUILD_TYPE STREQUAL "Debug")
    set(OpenCV_DIR "C:/repos-nobackup/opencv/.install/Debug")
else()
    set(OpenCV_DIR "C:/repos-nobackup/opencv/.install/Release")
endif()

set(Eigen3_DIR "C:/njs/vcpkg/installed/x64-windows-static-md-v142-sdk10b17134/share/eigen3")
find_package(Eigen3 3.3.7 CONFIG REQUIRED)

list(APPEND CMAKE_MODULE_PATH "${CMAKE_SOURCE_DIR}/.vscode") # contains my custom FindIPP.cmake because there is no cmake standard find module for IPP
set(IPP_STATIC ON)
set(IPP_DIR "C:/Program Files (x86)/Intel/oneAPI/ipp/latest")
find_package(IPP 2021.1.0 REQUIRED) # COMPONENTS cc i iw)

But the toolchain way is probably better.

diablodale commented 3 years ago

Please read the following in a neutral manner. I do not intend any insults. I am writing with an intention to detail my perspective and seek a solution that helps the most people. That is usually my goal. I created solutions for a billion people when I was a Microsoft employee. That experience causes me to have-no-fear, to solve hard problems, and solve problems for the most people.

Shifting the problem to someone else

My primary effort is developing apps for artists. I am the bridge between artists and complex technology. Yes, I can also dive deep into memory and threading bugs in libraries. 😉 I need my dependencies to be reliable and manage their own internals. I have no desire to dig into the internals of a dependency. It is not my job/role.

Unfortunately, I sometimes have to do the work that should be done by a dependency. I've done it for both OpenCV and DepthAi projects. (Remember, no blame/insults.) Going into the internals of another team is usually not a good thing. Me having to: 1) hack DepthAI's cmake files because, 2) DepthAi and OpenCV don't manage their own internal dependencies, because 3) OpenCV uses custom symbolic names for DLLs, and 4) OpenCV doesn't distribute the cmake files that create these symbolic names it uses itself.....ugh. No good here. 😝

All of that means I (app developer) have to review/correct the internals of OpenCV's cmake infrastructure, write my own cmake files to match OpenCV's internals, and hack DepthAI and/or Hunter. Only then can I do my job, which is to write applications for artists. I'm smart and experienced, and I fixed all this myself. But it takes days to weeks of my time. Time is something that can never be replaced; the most valuable thing in the world. And this only fixes it for me; not the other 7.8 billion people in the world.

There is also another shift-the-problem in this discussion. I assert that shared libraries shift problems to end-users. End-users (like my artists) have problems when they don't have the exact shared libraries needed at runtime. The shared libraries are installed/replaced/removed by system upgrades, other app installs, PATH changes, and this is well-known on Windows as "DLL Hell". Everyone with experience has seen this. End-users usually can not fix this -- they do not have the knowledge/capability.

OpenCV example

Please remember...no insults. I am not debating the pro/cons of static-vs-shared libraries. Instead, I want to examine/inspect the fears that both OpenCV and DepthAi are expressing. And I need to use static/shared libraries as it is part of this fear.

@alalek, who are you protecting? And what are you protecting them from? I request you clearly document this.✍

You write, "You would get linker errors, or crashes in runtime later." When libraries are linked together, the linker immediately reports ABI problems. That's how a linker works. This is not a problem and nothing to fear. When a compiled/linked app is tested...errors/crashes are the goal. This is not a problem and nothing to fear. This "linker error, crashes" you write happens with both static and shared libraries. It is not unique to one or the other.

When OpenCV is compiled both as a shared-library (DLL) or as a static library it continues to produce the "linker errors, crashes" problem that alalek writes. How? Because that's how linkers work. They report ABI problems when they link. Crashes later due to API problems are also in both shared and static -- because OpenCV.DLL and OpenCV.LIB do not have all code linked within them. At runtime, both static OpenCV.LIB or shared OpenCV.DLL themselves use shared-libraries (c/c++ runtimes, all Windows functionality, ffmpeg, intel media decoders, graphics drivers, onnxruntime, tbb, ipp, cuda, opencl runtime, etc.). None of the code I listed is contained within OpenCV.LIB or OpenCV.DLL. Both can have runtime crashes.💥

The following static and shared library examples have equal chance of success or failure. I see no reason for fear.

Distribution of static libraries is similar to distribution of shared libraries. Both are files on a hard drive. Both benefit from versioning. Both can be distributed with packages (linux rpm/deb, windows msi/exe, vcpkg, Chocolatey, etc.) and all those package tools have dependency management. All compilers and linkers have options to override default search paths for libraries, and full path names to libraries can always be used. Both cmake find_dependency() and find_package() have versioning capability. If something specific is needed, then projects should specify it.
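For example, both mechanisms accept a version argument (version numbers illustrative):

```cmake
# In a consuming project:
find_package(OpenCV 4.5 REQUIRED)

# In a package's own *Config.cmake:
include(CMakeFindDependencyMacro)
find_dependency(Eigen3 3.3)
```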

Is OpenCV suggesting that OpenCV has a problem when a random person on the Internet downloads OpenCV source code, compiles it, and posts a ZIP file of the OpenCV.LIB or OpenCV.DLL with no versioning, packaging, or cmake? Is this random developer the primary focus of the OpenCV team? Even if this was user#1, this random ZIP download of OpenCV.DLL or OpenCV.LIB both have 3rd party external dependency challenges and both require those dependencies be installed before a compile/link/run will succeed.

Libraries (both shared and static) provide their own ABI/API guarantees. Compiling with a different version LIB, or having a different shared library/DLL installed, are both challenges. Though, I believe the shared library/DLL is worse due to DLL Hell.🥵

My focus is on my app...not OpenCV. When I compile my app and link to static OpenCV, I am guaranteed to have that exact version of OpenCV because it is linked/contained within my app. I can test it. In contrast, if I link to shared-library OpenCV.DLL, I can test it but I can't easily control what version of OpenCV.DLL is found on the end-user's computer. Proof? 🔍 One of the most common complaints and problems with OpenCV on the internet is "Which version of the OpenCV DLL works with this app, 32/64-bit, os version" and "How do I install the OpenCV Dlls". Do a quick search with Google. End-users have shared-library "OpenCV.DLL Hell".

DepthAi example

Depthai yesterday proved that the fear alalek describes applies to both shared and static libraries. I found a mutex bug in DepthAi where the mutex was not being locked. Correctly locking the mutex causes a chain reaction that exposes bugs in other code. The ABI and API are compatible before and after my mutex fix. However, if depthai were compiled as a shared library, then all apps which linked to a shared DepthAi library would now randomly crash due to timing/race conditions. If DepthAi were compiled static, then the apps wouldn't crash, because they would continue to use the older code with the mutex bug that the apps' code was lucky not to expose. You can easily reproduce this using DepthAi's own ctest suite of tests. Before the fix, all pass. After the fix, 1 to 3 will usually fail due to timing/race conditions.

Depthai, you will also have OpenCV.DLL hell. Some developer will use shared-library version=4.5.6 OpenCV.dll to build depthai-core.lib. This means anyone that links to that depthai-core.lib will need a version of OpenCV.dll that is 100% compatible with version 4.5.6. Then the same dev (or another dev) will write an app that uses both OpenCV and DepthAi apis, and compile/links to version=4.0.1 OpenCV.dll to build their app. Naturally, this means the app attempts to link to both depthai-core.lib and the OpenCV.dll library stubs. This is OpenCV.DLL hell because you can't have two different versions of OpenCV.dll linked/loaded in the same app. OpenCV might say "we are ABI/API compatible" but that is not true (because OpenCV had bugs, fixed bugs and added/changed features) and now the shared-library OpenCV.DLL exhibits the same problems alalek wrote above "linker errors, or crashes in runtime later".

Test as soon as possible

Is ABI and API compatibility something to fear? No. All projects that matter will test for ABI/API compatibility issues. And those ABI/API compatibility issues exist for BOTH shared and static libraries.

All projects that want reliability do testing. The remaining untested random projects need automation to manage internals/dependencies. These random projects need to be able to do a simple cmake --build. They should not do the 4-step reverse-engineering and hack that I describe above; and they might not have the capability to do it.

So what is this fear of static libraries? And who is being protected?🤔

Static libraries are tested at compile/link-time when developers and testers are actively examining and testing. My testers have problems. Good! That is their job! It keeps the problem with the testers, not my end-user artists. Shared libraries should also test at compile/link-time. But they can't test the end-user's computer because of OpenCV.DLL Hell.

Shifting Redux

From what I can discern, this aversion to static libraries is an attempt to shift the responsibility of solving problems from one's project to downstream projects. And in my scenario, each of you (OpenCV and DepthAi) have cascaded that to my project and the other dev that reported the above linked issue. I am confident we are not the only two in the world. OpenCV doesn't want to do it. DepthAi doesn't want to do it. So now I and other app devs have to do it. And we have to reverse-engineer the logic and internals of both OpenCV and DepthAI. 😞

Who are you both protecting?

We devs appreciate the functionality your libraries provide. Thank you. 🙂 For some like DepthAi we directly pay money. Others like OpenCV are indirectly paid through sponsorships from companies like Intel and we buy Intel hardware. There is compensation for the effort occurring.

Unfortunately, your projects are making it harder for us developers. Giving us vague protection for something we don't want. I know you are not doing this with malice. Both your teams have demonstrated a willingness to make things better. I suspect your projects' fears above are based in outdated rumors rather than current fact. Is this fear based on past/outdated use of static libraries before today's versioning, package control, and modern cmake? Or perhaps it is self-protection. Are you trying to protect your own dev/test resources so they don't have to update your cmake code? Or, are you trying to protect your support teams from something?

Need user scenarios

I request you write out a full user scenario. This is usually 2-3 paragraphs. Describe this "person" that you are protecting. Describe two specific problem scenarios for this person. The ten-word fear that alalek writes above is too short and vague to be actionable or to show the reasoning behind it.

With clear user scenarios, we as a group can probably find a solution that works for us and for the most people. That is my goal. Remember, I've already hacked a workaround for my own project. I want to fix it for the other 7.8 billion people in the world.

themarpe commented 3 years ago

@diablodale please point me to modern CMake projects / examples properly doing transitive static dependencies. I'm wondering how they pass down these dependencies, how well it plays with other CMake projects, etc... I'd gladly take a look, as I too had a bunch of static library problems in the past (although before modern CMake became the norm).

As of right now, the way our library handles static dependencies is by installing them alongside it. So compiling DepthAI statically (and disabling optional OpenCV support, as we do not handle OpenCV's dependency in the same manner as our internal ones) will install all needed dependencies alongside it, making a find_package(depthai) correctly resolve everything needed to compile it into your project. This works for shared compilation as well.
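To make the above concrete, here is a minimal sketch of what a consumer project could look like under that scheme. This is an illustration, not an excerpt from depthai-core; the exported target name `depthai::core` and the install prefix path are assumptions.

```cmake
# Hypothetical consumer CMakeLists.txt (target names are assumptions).
cmake_minimum_required(VERSION 3.14)
project(myapp CXX)

# Point CMake at the depthai install prefix, e.g.
#   cmake -D CMAKE_PREFIX_PATH=/path/to/depthai-install ..
find_package(depthai CONFIG REQUIRED)

add_executable(myapp main.cpp)

# If the package config installed alongside the static libraries is
# complete, this single line should pull in all transitive dependencies.
target_link_libraries(myapp PRIVATE depthai::core)
```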

I'm not sure if our way is correct, pretty sure it wouldn't play well with installing things to system directories, but for starters that is the way I chose to do it.

diablodale commented 3 years ago

Hi, I won't be able to point you to specific examples, as I don't have the corpus of the Internet in my mind. ;-) In general, a Google search can help, as can the official CMake documentation Kitware provides online and the many links I provided above. While doing that linkfest, I could find nothing like "CMake does it all... oh, but not static libraries... we don't do that". Instead, CMake provides clear interfaces to declare static, dynamic, public/interface/private, etc., which all allow (sub)projects and (sub)dependencies to declare their own needs/dependencies.
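A short sketch of those declaration interfaces, with illustrative target names (nothing here is taken from depthai-core or OpenCV):

```cmake
# Hypothetical library declaring its own needs so consumers inherit them.
add_library(mylib STATIC mylib.cpp)

# PUBLIC: mylib uses it AND consumers must compile/link against it too.
target_link_libraries(mylib PUBLIC some_dep::some_dep)

# PRIVATE: only mylib's implementation uses it. For a static mylib,
# CMake still records it in the link interface so the final executable
# links it correctly.
target_link_libraries(mylib PRIVATE internal_dep::internal_dep)

# Usage requirements propagate the same way, e.g. a required C++ standard:
target_compile_features(mylib PUBLIC cxx_std_11)
```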

themarpe commented 3 years ago

I agree - I think CMake does offer everything needed to correctly define all dependencies between libraries, etc... Although I think the pain point is distribution of static libraries (shared libraries seem to be given more thought).

That said, regarding the original issue, I'll try to find some time to play around with the toolchain issue. Any pointers on this, or a minimal example which isolates the issue? Otherwise I will try to create a similar case locally.

diablodale commented 3 years ago

The need to distribute the library files (.a, .lib, .dll, etc.) is present both in today's broken OpenCV approach and in the approach I suggest. No change in that; it is no better or worse. Someone, somewhere has to save those files on the hard drive.

What I am proposing is that OpenCV declare what it needs using standard CMake declarations, rather than I and other devs having to reverse-engineer the OpenCV build/cmake to figure out the proprietary needs and symbolic/file names used by OpenCV. Intel Performance Primitives (IPP) is one such example. OpenCV wrote their own private FindIPP.cmake, doesn't distribute it, and doesn't document the proprietary symbolic names of its libraries. This is one of the issues I linked above, and the one that initially misled me away from the modern CMake rule that "All required dependencies of a package must also be found in the package configuration file".
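The quoted rule is typically satisfied in the package's config file template. A minimal sketch, with hypothetical names (mylib, and whichever dependencies apply), could look like:

```cmake
# Hypothetical mylibConfig.cmake.in, processed by
# configure_package_config_file(). It re-finds the package's own
# dependencies so a consumer's find_package(mylib) resolves everything.
@PACKAGE_INIT@

include(CMakeFindDependencyMacro)
find_dependency(Threads)
find_dependency(OpenCV)  # works only if OpenCV also ships a proper config

# Finally, load the exported targets (mylib::mylib etc.).
include("${CMAKE_CURRENT_LIST_DIR}/mylibTargets.cmake")
```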

I exposed this issue trying to compile depthai static with OpenCV static on Windows 10 64-bit. You might see more concrete failures compiling on other platforms, because shared libs on Windows (aka DLLs) don't support the CMake flag CMAKE_POSITION_INDEPENDENT_CODE; the Windows OS implements that idea in a completely different way than Linux. It's a no-op on Windows.
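Tying this back to the original recommendation: the PIC setting can live in ordinary project code instead of a generated CMAKE_TOOLCHAIN_FILE. A sketch (the target name depthai-core is an assumption):

```cmake
# Request position-independent code without hijacking CMAKE_TOOLCHAIN_FILE.
# Project-wide default, only relevant when building shared libraries:
if(BUILD_SHARED_LIBS)
    set(CMAKE_POSITION_INDEPENDENT_CODE ON)
endif()

# Or scoped to a single target:
set_target_properties(depthai-core PROPERTIES POSITION_INDEPENDENT_CODE ON)

# On Windows this property has no effect; DLL relocation is handled by
# the OS loader, so it is a harmless no-op there.
```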

Luxonis-Brandon commented 3 years ago

So we fund OpenCV open source software development with our sales of the OpenCV AI Kit, but we don't actually have a say on things like these (and don't know who to talk to about this). I'd love to know who to talk to about it, though. I'll ask Satya if he knows, but I'm not sure he does either.

Kevin-Delnoije commented 3 years ago

I stumbled on this issue because I was surprised that the installation instructions did not provide an option to use a package manager for installing the dependencies. I am not a professional C++ developer nor a library maintainer, so take this advice with a grain of salt.

The two most popular C++ package managers are vcpkg and Conan, and both have good integrations with CMake, so I suggest making a project file for each package manager with the required dependencies.

As for writing CMake files, there are a lot of bad resources on the internet. One of the CMake maintainers wrote an excellent book which receives regular updates. On YouTube there are C++ conference talks that discuss how to package a library; I recommend two talks by a vcpkg maintainer that discuss the common pitfalls. 1 2

The SDK of the RealSense library also has an option to install using vcpkg, so that might be a nice reference for implementing it here.

themarpe commented 3 years ago

Hi @Kevin-Delnoije, thanks for your thoughts on this. We haven't yet added any package manager support outside of Hunter (which we use as part of building the library to get the dependencies). Still, having support for vcpkg & Conan would make this play nicely with other dependencies also added to a project.

Do you primarily need the support for installing via a package manager as a means of integrating with a project or just to be able to easily install the library?