deepmodeling / deepmd-kit

A deep learning package for many-body potential energy representation and molecular dynamics
https://docs.deepmodeling.com/projects/deepmd/
GNU Lesser General Public License v3.0

feat(pt): add more information to summary and error message of loading library #3895

Closed: njzjz closed this pull request 1 week ago

njzjz commented 1 week ago

Most of these messages are copied from the TF backend for a consistent user experience.

The summary looks like this:

```
[2024-06-22 06:19:31,090] DEEPMD INFO    --------------------------------------------------------------------------------------------------------------
[2024-06-22 06:19:31,090] DEEPMD INFO    installed to:          /home/jz748/codes/deepmd-kit/deepmd
[2024-06-22 06:19:31,090] DEEPMD INFO                           /home/jz748/anaconda3/lib/python3.10/site-packages/deepmd
[2024-06-22 06:19:31,090] DEEPMD INFO    source:                v3.0.0a0-229-g6d2c6095
[2024-06-22 06:19:31,090] DEEPMD INFO    source branch:         pt-add-more-info
[2024-06-22 06:19:31,090] DEEPMD INFO    source commit:         6d2c6095
[2024-06-22 06:19:31,090] DEEPMD INFO    source commit at:      2024-06-22 06:16:55 -0400
[2024-06-22 06:19:31,090] DEEPMD INFO    use float prec:        double
[2024-06-22 06:19:31,090] DEEPMD INFO    build variant:         cuda
[2024-06-22 06:19:31,090] DEEPMD INFO    Backend:               PyTorch
[2024-06-22 06:19:31,090] DEEPMD INFO    PT ver:                v2.1.2.post300-ge32f208075b
[2024-06-22 06:19:31,090] DEEPMD INFO    Enable custom OP:      True
[2024-06-22 06:19:31,090] DEEPMD INFO    build with PT ver:     2.1.2
[2024-06-22 06:19:31,090] DEEPMD INFO    build with PT inc:     /home/jz748/anaconda3/lib/python3.10/site-packages/torch/include
[2024-06-22 06:19:31,090] DEEPMD INFO                           /home/jz748/anaconda3/lib/python3.10/site-packages/torch/include/torch/csrc/api/include
[2024-06-22 06:19:31,090] DEEPMD INFO    build with PT lib:     /home/jz748/anaconda3/lib/python3.10/site-packages/torch/lib
[2024-06-22 06:19:31,090] DEEPMD INFO    running on:            localhost.localdomain
[2024-06-22 06:19:31,090] DEEPMD INFO    computing device:      cuda:0
[2024-06-22 06:19:31,090] DEEPMD INFO    CUDA_VISIBLE_DEVICES:  unset
[2024-06-22 06:19:31,090] DEEPMD INFO    Count of visible GPUs: 2
[2024-06-22 06:19:31,090] DEEPMD INFO    num_intra_threads:     0
[2024-06-22 06:19:31,091] DEEPMD INFO    num_inter_threads:     0
[2024-06-22 06:19:31,091] DEEPMD INFO    --------------------------------------------------------------------------------------------------------------
```
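
For context, here is a minimal, hypothetical sketch of how such an aligned key/value summary could be logged. The `print_summary` helper and the `build_info` fields below are illustrative only, not the actual deepmd-kit implementation:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("deepmd")


def print_summary(build_info: dict) -> None:
    """Log a two-column summary with all values aligned in one column."""
    width = max(len(key) for key in build_info) + 1
    log.info("-" * 110)
    for key, value in build_info.items():
        label = key + ":"
        # A list value (e.g. several include paths) continues on extra lines
        # under an empty label, as in the output above.
        lines = value if isinstance(value, list) else [value]
        log.info(f"{label:<{width}} {lines[0]}")
        for extra in lines[1:]:
            log.info(f"{'':<{width}} {extra}")
    log.info("-" * 110)


# Hypothetical usage:
print_summary(
    {
        "Backend": "PyTorch",
        "build variant": "cuda",
        "Count of visible GPUs": "2",
    }
)
```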

The error message looks like this:

```
deepmd/pt/cxx_op.py:39: in load_library
    torch.ops.load_library(module_file)
../../anaconda3/lib/python3.10/site-packages/torch/_ops.py:852: in load_library
    ctypes.CDLL(path)
../../anaconda3/lib/python3.10/ctypes/__init__.py:374: in __init__
    self._handle = _dlopen(self._name, mode)
E   OSError: /home/jz748/anaconda3/lib/python3.10/site-packages/deepmd/lib/libdeepmd_op_pt.so: undefined symbol: _ZNK5torch8autograd4Node4nameEv

The above exception was the direct cause of the following exception:
source/tests/pt/test_LKF.py:9: in <module>
    from deepmd.pt.entrypoints.main import (
deepmd/pt/__init__.py:4: in <module>
    from deepmd.pt.cxx_op import (
deepmd/pt/cxx_op.py:95: in <module>
    ENABLE_CUSTOMIZED_OP = load_library("deepmd_op_pt")
deepmd/pt/cxx_op.py:51: in load_library
    raise RuntimeError(
E   RuntimeError: This deepmd-kit package was compiled with CXX11_ABI_FLAG=0, but PyTorch runtime was compiled with CXX11_ABI_FLAG=1. These two library ABIs are incompatible and thus an error is raised when loading deepmd_op_pt. You need to rebuild deepmd-kit against this PyTorch runtime.
```
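
As a rough sketch, a dlopen failure could be translated into this kind of message by catching the `OSError` and comparing ABI flags. The `PT_CXX11_ABI_FLAG` constant below is a stand-in for a value recorded at build time; the real logic lives in `deepmd/pt/cxx_op.py` and may differ:

```python
import torch

# Hypothetical: the CXX11 ABI flag deepmd-kit was compiled with, recorded at build time.
PT_CXX11_ABI_FLAG = 0


def load_library(module_file: str) -> bool:
    """Load the custom-op shared library, turning a raw dlopen failure
    into a more actionable error message."""
    try:
        torch.ops.load_library(module_file)
    except OSError as e:
        # torch.compiled_with_cxx11_abi() reports the ABI of the running PyTorch.
        runtime_abi = int(torch.compiled_with_cxx11_abi())
        if runtime_abi != PT_CXX11_ABI_FLAG:
            raise RuntimeError(
                f"This deepmd-kit package was compiled with "
                f"CXX11_ABI_FLAG={PT_CXX11_ABI_FLAG}, but the PyTorch runtime was "
                f"compiled with CXX11_ABI_FLAG={runtime_abi}. These two ABIs are "
                f"incompatible, so {module_file} cannot be loaded. "
                f"Rebuild deepmd-kit against this PyTorch runtime."
            ) from e
        raise
    return True
```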


coderabbitai[bot] commented 1 week ago


Walkthrough

The changes improve the robustness of the PyTorch backend of deepmd-kit and add compatibility checks. They center on exception handling when loading the custom-op shared library, verifying that the CXX11 ABI flag and the PyTorch version match those used at build time. In addition, the entry points now report extra backend information when custom operations are enabled.

Changes

| File | Change Summary |
|------|----------------|
| `deepmd/pt/cxx_op.py` | Added exception handling for `torch.ops.load_library`, ensuring CXX11 ABI flag and version compatibility, and raising errors on mismatches. |
| `deepmd/pt/entrypoints/main.py` | Imported `GLOBAL_CONFIG` and modified `get_backend_info` to conditionally include more information when `ENABLE_CUSTOMIZED_OP` is set. |
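
A hedged sketch of the `get_backend_info` change described in the table above; the import locations and the `GLOBAL_CONFIG` keys used here are assumptions, not the exact names used in the PR:

```python
import torch

# Assumed import locations; the real definitions live inside deepmd-kit.
from deepmd.env import GLOBAL_CONFIG
from deepmd.pt.cxx_op import ENABLE_CUSTOMIZED_OP


def get_backend_info() -> dict:
    """Collect PyTorch backend details for the startup summary."""
    info = {
        "Backend": "PyTorch",
        "PT ver": torch.__version__,
        "Enable custom OP": ENABLE_CUSTOMIZED_OP,
    }
    if ENABLE_CUSTOMIZED_OP:
        # Build-time details are only meaningful when the custom op library exists.
        info.update(
            {
                "build with PT ver": GLOBAL_CONFIG["pt_version"],
                "build with PT inc": GLOBAL_CONFIG["pt_include_dir"],
                "build with PT lib": GLOBAL_CONFIG["pt_libs_dir"],
            }
        )
    return info
```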

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant Main as main.py
    participant PT as cxx_op.py
    participant PyTorch
    participant Config as deepmd.env

    User->>Main: Start application
    Main->>Config: Get GLOBAL_CONFIG
    Main->>PT: Load CXX operations
    PT->>PyTorch: torch.ops.load_library(module_file)
    PyTorch-->>PT: Check CXX11 ABI and version compatibility
    PT->>PT: Exception Handling and Error Raising
    Main->>Main: get_backend_info()
    Main-->>User: Provide backend info if ENABLE_CUSTOMIZED_OP is set
```

codecov[bot] commented 1 week ago

Codecov Report

Attention: Patch coverage is 30.00000% with 14 lines in your changes missing coverage. Please review.

Project coverage is 82.72%. Comparing base (b6f0fa3) to head (3a703aa).

| Files | Patch % | Lines |
|-------|---------|-------|
| deepmd/pt/cxx_op.py | 18.75% | 13 Missing :warning: |
| deepmd/pt/entrypoints/main.py | 75.00% | 1 Missing :warning: |
Additional details and impacted files

```diff
@@            Coverage Diff             @@
##            devel    #3895      +/-   ##
==========================================
- Coverage   82.74%   82.72%   -0.03%
==========================================
  Files         519      519
  Lines       50491    50510      +19
  Branches     3015     3015
==========================================
+ Hits        41781    41786       +5
- Misses       7773     7787      +14
  Partials      937      937
```
