matthew-brett / delocate

Find and copy needed dynamic libraries into python wheels

Update `parse_install_name` to the latest format. #130

Closed · HexDecimal closed this 2 years ago

HexDecimal commented 2 years ago

It's likely that otool changed its output since #78 and `parse_install_name` needs to be updated again.

So far this changes the function so that it includes the input string in the error message when parsing fails, which will make this easier to fix in the future. I also added tests for macOS 11 and Xcode 13, but they didn't fail.

Refactors how otool install names are parsed, which fixes #78.
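
A rough sketch of the idea, assuming a typical `otool -L` dependency line; the helper name and regex below are illustrative, not the code in this PR:

```python
import re
from typing import Tuple

# Hypothetical pattern for a line such as:
#   \t/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1292.100.5)
INSTALL_NAME_RE = re.compile(
    r"^\t(?P<name>.+) \(compatibility version (?P<compat>[\d.]+), "
    r"current version (?P<current>[\d.]+)\)$"
)


def parse_otool_line(line: str) -> Tuple[str, str, str]:
    """Return (install_name, compatibility_version, current_version)."""
    match = INSTALL_NAME_RE.match(line)
    if match is None:
        # Echo the unparsed text so a future otool format change is obvious
        # from the traceback alone.
        raise ValueError(f"Could not parse otool line: {line!r}")
    return match["name"], match["compat"], match["current"]
```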

matthew-brett commented 2 years ago

Looks good to me. Please do merge when happy.

codecov-commenter commented 2 years ago

Codecov Report

Merging #130 (7435a03) into master (9c2c6a4) will increase coverage by 0.27%. The diff coverage is 97.75%.


```diff
@@            Coverage Diff             @@
##           master     #130      +/-   ##
==========================================
+ Coverage   95.28%   95.56%   +0.27%
==========================================
  Files          13       13
  Lines        1018     1059      +41
==========================================
+ Hits          970     1012      +42
+ Misses         48       47       -1
```

| Impacted Files | Coverage Δ |
| --- | --- |
| delocate/tools.py | 96.07% <97.75%> (+1.21%) ⬆️ |

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data. Last update 9c2c6a4...7435a03.

HexDecimal commented 2 years ago

I think I've finished this PR. It can now parse the output when otool lists multiple architectures of a library.

All the uses of otool are converted to use the more generic functions. The old functions can't handle it when different architectures return different values, so they'll raise an error whenever that happens. I'll save that kind of thing for a later issue; this was just to fix the parsing.
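
To illustrate the per-architecture grouping, here is a hedged sketch with invented names, not the functions actually added in delocate/tools.py. It assumes newer otool prints a separate header per architecture of a fat binary, e.g. `libfoo.dylib (architecture arm64):`.

```python
from typing import Dict, List


def split_by_architecture(stdout: str) -> Dict[str, List[str]]:
    """Group tab-indented otool -L lines under their architecture header."""
    sections: Dict[str, List[str]] = {}
    current = "all"  # single-architecture output has no explicit header
    for line in stdout.splitlines():
        if line and not line.startswith("\t"):
            # Header line, e.g. "libfoo.dylib (architecture arm64):" or "libfoo.dylib:".
            marker = "(architecture "
            if marker in line:
                current = line.split(marker, 1)[1].split(")", 1)[0]
            sections.setdefault(current, [])
        elif line.strip():
            sections.setdefault(current, []).append(line.strip())
    return sections
```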

Since the currently installed tools don't produce the output being tested, I've written new tests that feed in the old and new expected otool stdout as strings. A nice thing is that these tests also run on non-macOS platforms.
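
A hypothetical example of such a string-based test, building on the `split_by_architecture` sketch above; the canned output and test name are invented:

```python
NEW_STYLE_OTOOL_OUTPUT = (
    "libfoo.dylib (architecture x86_64):\n"
    "\t/usr/lib/libSystem.B.dylib "
    "(compatibility version 1.0.0, current version 1292.100.5)\n"
    "libfoo.dylib (architecture arm64):\n"
    "\t/usr/lib/libSystem.B.dylib "
    "(compatibility version 1.0.0, current version 1292.100.5)\n"
)


def test_split_by_architecture_new_format() -> None:
    # No otool binary is needed: the parser only ever sees this string,
    # so the test can also run on Linux or Windows CI workers.
    sections = split_by_architecture(NEW_STYLE_OTOOL_OUTPUT)
    assert sorted(sections) == ["arm64", "x86_64"]
    assert all(len(lines) == 1 for lines in sections.values())
```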

@papr might want to review this. Hopefully his dependencies are simple enough that this PR will resolve them in its current state.

matthew-brett commented 2 years ago

Yes - but my memory was that the original tests deliberately moved out of the source directory in order to catch the installed package. Maybe I forgot to do that, this time.

HexDecimal commented 2 years ago

> Yes - but my memory was that the original tests deliberately moved out of the source directory in order to catch the installed package. Maybe I forgot to do that, this time.

The GitHub workflows have their own isolated tests, which I know test the installed package instead of any local files because they only have the wheel to work with.

HexDecimal commented 2 years ago

I've replaced the asserts you mentioned with exceptions. There isn't a realistic way to craft an input that covers those branches, so maybe I should have argued harder for keeping the asserts.

matthew-brett commented 2 years ago

How does keeping the asserts help in the lack of coverage?

HexDecimal commented 2 years ago

> How does keeping the asserts help in the lack of coverage?

For example, a library can't have more than one install ID (per architecture, now), so you can't create a library that covers the check for more than one value being present. It's just a minor issue that comes up with some kinds of error checks.
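
A minimal sketch of the trade-off being discussed; the function and message are invented, not delocate's code. The duplicate-ID branch is essentially unreachable with real libraries, but raising an explicit exception keeps the check active even under `python -O`:

```python
from typing import List


def single_install_id(ids: List[str]) -> str:
    """Return the library's install ID, insisting there is exactly one."""
    if len(ids) != 1:
        # A Mach-O library carries at most one install ID per architecture,
        # so reaching this branch means the otool output was misread.
        raise ValueError(f"Expected exactly one install ID, got: {ids!r}")
    return ids[0]
```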

matthew-brett commented 2 years ago

Yes - I do understand that it is difficult to test - but it seems to me that's unrelated to the desirability or otherwise of asserts. Or did I miss something?

HexDecimal commented 2 years ago

I don't think you missed anything. Other than maybe that it's actually pretty uncommon to put Python into optimized mode for scripts. Asserts being enabled is typically the default.
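
For context, a tiny illustration of that default (not project code): `assert` statements are only compiled away when the interpreter runs with `-O`.

```python
# Run as `python check.py` and the assert fires; run as `python -O check.py`
# and the assert is stripped, so the other branch executes instead.
try:
    assert False, "deliberately failing check"
    print("check was skipped (-O mode)")
except AssertionError:
    print("assert is active (default mode)")
```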

matthew-brett commented 2 years ago

As ever - thanks for working so quickly on these issues.