jktjkt opened 2 years ago
It seems that support for the newer version 2 in the Python bindings (libyang-python) is "planned before the end of the year":
https://github.com/sysrepo/sysrepo-python/issues/12#issuecomment-905272635
In this case, wrapping the C++ bindings seems like unnecessary work.
But if I understand things correctly, this anyway doesn't solve the packaging/distribution problem since "pip install libyang" does not provide any precompiled binaries. You need GCC and development headers already installed on your system:
https://pypi.org/project/libyang/
Digging further, it seems that the libyang Python package used to bundle libyang.so but stopped doing so because it was difficult to maintain:
https://github.com/CESNET/libyang-python/commit/fae31204b91f3d14ca1b9c85686016f5a4367065
My feeling is that this whole thing will be quite tricky for us in GNPy to handle but I don't have any alternative solution either...
But if I understand things correctly, this anyway doesn't solve the packaging/distribution problem since "pip install libyang" does not provide any precompiled binaries. You need GCC and development headers already installed on your system:
My plan was to bundle the libyang.so inside the wheel and to maintain these bindings under the TIP umbrella. Not ideal, I know...
At least we can still play well with the Linux distro ecosystem by producing a source release which does not bundle anything (and which the distros/source-based-folks care about), and a binary wheel for convenience.
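For what it's worth, here is a minimal sketch of what the bundling idea above could look like with plain setuptools; the package layout, file names, and the BinaryDistribution trick are illustrative assumptions, not the actual GNPy packaging code.

```python
# Sketch only: bundle a prebuilt libyang.so (and libpcre2) inside the wheel.
# The "gnpy/_libs" layout and library file names are assumptions.
from setuptools import setup, find_packages
from setuptools.dist import Distribution


class BinaryDistribution(Distribution):
    """Force a platform-specific wheel even without a compiled extension of our own."""

    def has_ext_modules(self):
        return True


setup(
    name="gnpy",
    packages=find_packages(),
    # Ship the prebuilt shared libraries next to the Python code; the bindings
    # would then load them from the package directory instead of the system.
    package_data={"gnpy": ["_libs/libyang.so.2", "_libs/libpcre2-8.so.0"]},
    distclass=BinaryDistribution,
)
```

The has_ext_modules() override simply makes the wheel be tagged as platform-specific, so the bundled .so files never end up in a "universal" wheel.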
Would depending on libyang mean that GNPy cannot run natively on Windows anymore?
Originally I was hoping that we can keep native builds for Windows and Mac OS X because these are supported by cibuildwheel. What libyang needs is only libpcre2, and a C compiler and CMake, so it will be "just" a matter of invoking cmake (and somehow building PCRE2).
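To give a concrete feel for that "invoking cmake" step, here is a rough sketch of a custom setuptools build_ext that builds a vendored libyang checkout first; the source path and the placeholder extension are assumptions, and PCRE2 handling is deliberately left out.

```python
# Sketch only: run CMake for libyang as part of a Python wheel build.
import pathlib
import subprocess

from setuptools import Extension, setup
from setuptools.command.build_ext import build_ext


class CMakeLibyangBuild(build_ext):
    """Build libyang with CMake before the Python extension itself is built."""

    def run(self):
        src = pathlib.Path("third_party/libyang").resolve()  # assumed vendored checkout
        build_dir = pathlib.Path(self.build_temp) / "libyang"
        build_dir.mkdir(parents=True, exist_ok=True)
        subprocess.check_call(
            ["cmake", str(src), "-DCMAKE_BUILD_TYPE=Release"],
            cwd=build_dir,
        )
        subprocess.check_call(["cmake", "--build", ".", "--parallel"], cwd=build_dir)
        # PCRE2 is not handled here; it would have to be found on the system or
        # built the same way and pointed to via CMAKE_PREFIX_PATH.
        super().run()


setup(
    name="gnpy",  # placeholder metadata for the sketch
    ext_modules=[Extension("gnpy._libyang", sources=[])],  # placeholder extension
    cmdclass={"build_ext": CMakeLibyangBuild},
)
```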
But I guess it's fair to ask for some re-evaluation of priorities here. We're already distributing Docker containers, so it's a question of whether we still want to support "native installation" of Python code on various platforms using prebuilt binaries. If this was relaxed, it would be possible to base the Python code on libyang-cpp, simplifying this job a lot (when it comes to working with YANG data trees in libyang, the code is not that simple, unfortunately).
What libyang needs is only libpcre2, and a C compiler and CMake, so it will be "just" a matter of invoking cmake (and somehow building PCRE2).
Yes, but from some quick searching on the internet I can't find any indication of anyone managing to do that on Windows, and this issue is still open:
If I understand correctly, the introduction of YANG validation may imply that GNPy no longer supports Windows and Mac OS in its future releases (only Linux). This means that users who usually develop on these OSes, or applications that use GNPy as an external library, could be impacted. Am I correct? Would this be an issue in Polito, @AndreaDAmico? Or for the future GUI?
Yes, but from some quick searching on the internet I can't find any indication of anyone managing to do that on Windows, and this issue is still open:
The majority of the portability hurdles are in other layers of the stack that my colleagues at CESNET are producing (sysrepo and netopeer2, the NETCONF server, due to POSIX SHM and robust mutexes, I guess), so this looks like a leftover ticket. But it's true I have not built this on Windows myself.
Honestly, I am more worried about properly understanding the usual ABI pitfalls of mingw cross-builds vs. MSVC, dealing with various versions of the compiler and the debug vs. optimized ABI, etc. The last time I used MSVC was at the university, which was back in 2006, then I got some exposure through the Qt C++ framework, but I've been focusing exclusively on Linux ever since. How come, say, the wheels for scipy do not distinguish between various MSVC releases? Is my knowledge outdated? I don't know...
If I understand correctly, the introduction of YANG validation may imply that GNPy no longer supports Windows and Mac OS in its future releases (only Linux).
This is more complex than a yes/no. I'm saying that I'll have trouble producing "Python binary wheels" for all relevant versions of Windows and OSX (and ancient Linux as well). No matter what we do, the worst case is that installation on an "exotic platform" will require a C compiler, perhaps even a modern C++ compiler, cmake, etc -- but GNPy will run there.
I'm now exploring ways to make this simpler, but it seems that there's a tradeoff between:
Or for the future GUI?
The GUI that is currently subcontracted to Vayu will talk to an external GNPy service on the network, so this is not affected.
@EstherLerouzic , your workflow is a Ubuntu VM, right? Is it OK to assume that you can run on "latest and greatest" with little to no problem (like 21.10)?
@ojnas , what about you?
@AndreaDAmico , on what system are you developing GNPy?
Do we know someone who is using GNPy on Windows? What is their workflow? Do they use Anaconda?
What about OS X, @ggrammel, I believe you're using that? How did you install gnpy, where did you get your Python from, etc.?
Regarding Windows, it seems that my memories were overly pessimistic. Scipy says that there's been a "Universal C Runtime" since the release of Visual Studio 2015, which is good news because we can use a newer MSVC than the one which was used for building Python itself.
@EstherLerouzic , your workflow is a Ubuntu VM, right? Is it OK to assume that you can run on "latest and greatest" with little to no problem (like 21.10)?
Yes, an Ubuntu VM, or calling GNPy as a library, also from an Ubuntu VM. I don't know all the releases my colleagues use (I currently develop on 20.04). There is only one case of a Windows developer, and it's not an issue (they can develop on Linux as well).
I can't find any indication of anyone managing to do that on Windows
Just a small update -- there's a bunch of POSIX assumptions which complicate the build a little bit:
- ietf-inet-types
- unistd.h, ssize_t, etc.

I'm working on these; this is a mandatory step as far as I can tell anyway.
Would this be an issue in Polito, @AndreaDAmico?
@AndreaDAmico , on what system are you developing GNPy?
I use Linux. A couple of colleagues of mine use GNPy on Windows.
@EstherLerouzic @AndreaDAmico: For those on Windows, can you please ask them how they installed GNPy and Python? Did they use Anaconda? Did they download Python directly and use pip? Or some other way?
I recall the following:
My most recent use of gnpy is actually using Jonas’ gnpy.app. Since I’m not a frequent docker user, the installation procedure I found easiest was via Anaconda.
A little add-on: the CLI example in the Readme on https://github.com/Telecominfraproject/oopt-gnpy suggests that there would be a folder oopt-gnpy/examples/… but there isn't. So I think the animated CLI demo should be changed.
Gert
Maybe another track that could be explored is to try and improve the performance of yangson. With a couple of simple optimizations I was able to shorten validation time on my computer from 32 seconds to 6 seconds for the Sweden OpenROADM topology, and from 560 seconds to 32 seconds for a larger 50-ROADM-node topology. This is maybe still too slow, but it doesn't seem impossible to find further optimizations. Of course, I can't be sure that the optimizations don't have any unwanted side effects, even though they pass the upstream tests. If you want to try it for yourself you can clone my fork here:
That's... significant :), thanks. I would like to spend more time getting the C extension usable, but speeding up yangson might be useful for some porting help (or perhaps as a temporary measure).
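For reference, the yangson calls whose runtime is being discussed here look roughly like this; the file names are placeholders rather than GNPy's actual layout.

```python
import json

from yangson import DataModel

# The "YANG library" document enumerates the modules; mod_path says where
# the .yang files themselves live.
dm = DataModel.from_file("yang-library.json", mod_path=("yang",))

with open("topology.json") as f:
    data = dm.from_raw(json.load(f))

data.validate()             # schema + semantic validation -- the slow part
data = data.add_defaults()  # fill in default values defined by the model
```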
A small update: my patches which port libyang to Windows have been merged upstream.
A small update: my patches which port libyang to Windows have been merged upstream.
That's a not so small achievement - good work!
We need a library that is capable of loading YANG data from JSON and validating that against a YANG model, supplying the default values, etc. We've determined that yangson has performance issues which are WONTFIX by upstream, so we need something else. It looks like pyang doesn't have YANG data validation out of the box, which means that we need to use something else.

There's a C library for working with YANG, and there are Python bindings for its previous version. The C API is rather complex, there's a bunch of rules about lifetime management of various structs, which are neatly abstracted away by a C++ wrapper. Ideally, I would like to wrap the C++ bindings with pybind11 because that's a technology stack that I'm familiar with.

In Python, there are ways to distribute C or C++ extensions in a precompiled form. The most popular one is currently the manylinux approach, which involves, essentially, taking an ancient Linux distribution (so as to not depend on "anything modern", using very old versions of all libraries instead), building the extension on this old system, and distributing the resulting binaries. The goal/hope here is to achieve compatibility with "everything". There are scripts like cibuildwheel which make it easy to integrate automatic building into CI/CD, including automatic pushing of the resulting binaries for a variety of architectures. Unfortunately, because libyang-cpp actively uses the modern C++20 standard, building currently requires a new enough GCC or Clang (upstream is building on GCC 11, Clang 12, etc). This is very tricky to do on ancient Linux distros because it is not "just the compiler", it's about the runtime libraries, the rest of the toolchain, etc.

One possible approach would be to build libyang and libyang-cpp statically. That would involve "baking in" libpcre2 and parts of glibc, and based upon my preliminary checks, that "sounds possible". A downside of this approach is that we would be bundling stuff, and that one cannot benefit from the existing scripting/CI infrastructure for automatic wheel building for various architectures. As a practical result, it would be hard to build an extension for "Linux/ARM", for example, or even for Windows or for a Mac. As a result, GNPy would be hard to install on these platforms because you would essentially need a modern C++ compiler.

Another approach is to try to refresh the older Python bindings for the C library, and wrap everything again with CFFI. This means that we would have to duplicate all the work which has already been done for the C++ bindings once again for Python. The upside is that we won't need to reinvent the build scripts, and it looks like someone is already trying to do this. We could still bundle libyang.so.2 in the resulting wheel, and perhaps libpcre2 as well.
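As a toy illustration of that last (CFFI) approach, the snippet below declares a tiny fragment of the libyang C API and loads a libyang.so.2 that is assumed to be bundled inside the wheel next to the Python code; real bindings would of course also cover context destruction, module loading, and data parsing/validation.

```python
# Sketch only: ABI-mode CFFI bindings against a bundled libyang.so.2.
# The "_libs" location inside the package is an assumption.
import pathlib

import cffi

ffi = cffi.FFI()
# Only a fragment of the real API; the full bindings would declare much more.
ffi.cdef("""
    struct ly_ctx;
    int ly_ctx_new(const char *search_dir, uint16_t options, struct ly_ctx **ctx);
""")

# Load the shared object shipped inside the wheel rather than from the system.
libdir = pathlib.Path(__file__).parent / "_libs"
lib = ffi.dlopen(str(libdir / "libyang.so.2"))

ctx_p = ffi.new("struct ly_ctx **")
ret = lib.ly_ctx_new(ffi.NULL, 0, ctx_p)  # 0 == LY_SUCCESS
print("context created" if ret == 0 else f"ly_ctx_new failed: {ret}")
```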