reactive-firewall / multicast

The multicast package is a Python library that simplifies sending and receiving multicast network messages. It provides classes and tools for implementing multicast communication in Python applications, making it straightforward to work with multicast sockets.

Regression found in tox testing with CI #119

Closed. reactive-firewall closed this issue 1 month ago.

reactive-firewall commented 1 month ago

Describe the bug

A regression in the tox testing stage of CI causes the tox run to error out before the project's unit tests execute.

To Reproduce

Steps to reproduce the behavior:

  1. Start a CI run.
  2. Wait for the tox test to run.
  3. View the tox run's logs/output.
  4. See the error.

Expected behavior

The tox run completes and does not error out before the project's unit tests run.

Logs (optional) see:


reactive-firewall commented 1 month ago
@reactive-firewall,

I'm glad to hear that you've resolved the issues and pushed your fixes.

Regarding your request, I've prepared a supplementary summary of our discussion since the last summary for issue #119:


Supplementary Summary of Discussion


You can use this summary to update issue #119 for posterity.

Let me know if you need any further assistance!


Learnings used

```
Learnt from: reactive-firewall
PR: reactive-firewall/multicast#110
File: docs/CI.md:32-32
Timestamp: 2024-09-17T06:40:42.918Z
Learning: Reactive-firewall prefers to keep the word "own" in headings when emphasizing the user's personal test.
```

---

```
Learnt from: reactive-firewall
PR: reactive-firewall/multicast#110
File: docs/USAGE.md:66-66
Timestamp: 2024-09-17T03:38:28.677Z
Learning: Reactive-firewall prefers to use "Caveat" as a heading instead of "Note".
```

_Originally posted by @coderabbitai[bot] in https://github.com/reactive-firewall/multicast/pull/118#discussion_r1767893053_

reactive-firewall commented 1 month ago

Notes:

  • Compatibility: This code uses only built-in modules (re) and standard string methods, ensuring compatibility with Python versions 3.0 through 3.13.
  • Handling of Multiple Constraints: If a requirement specifies multiple version constraints (e.g., package>=1.0,!=1.5,<2.0), the function extracts the minimum required version specified after >= and ignores the other constraints.
  • Comments and Empty Lines: Lines that are empty or start with # are ignored, as are any inline comments after a # symbol.
  • Options and Editable Installs: Lines starting with - (e.g., -e git+https://...) or URLs are skipped since install_requires does not support these formats.

from https://github.com/reactive-firewall/multicast/pull/118#discussion_r1767578719
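
For orientation, the following is a minimal illustrative sketch of a parser with the behaviour described above. It is not the project's actual implementation (which, per later comments in this thread, receives the file's contents as returned by setup.py's readFile rather than a path), and the regex and the fallback of keeping a bare package name are assumptions:

```python
import re


def parse_requirements_for_install_requires(requirements_text):
    """Illustrative sketch: reduce requirement lines to minimal install_requires entries."""
    install_requires = []
    for raw_line in requirements_text.splitlines():
        line = raw_line.split('#', 1)[0].strip()  # drop full-line and inline comments
        if not line:
            continue  # skip empty and whitespace-only lines
        if line.startswith('-') or '://' in line:
            continue  # skip pip options, editable installs, and URLs
        match = re.match(r'^([A-Za-z0-9._\[\]-]+)\s*(?:(>=)\s*([^,;\s]+))?', line)
        if not match:
            continue
        name, operator, version = match.groups()
        if operator == '>=' and version:
            install_requires.append(name + '>=' + version)  # keep only the minimum bound
        else:
            install_requires.append(name)  # no usable '>=' bound; keep just the name
    return install_requires
```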

To test the new function parse_requirements_for_install_requires, here are some suggestions to integrate with the existing testing framework:

  1. Create a New Test Module:

    • Location: Add a new test file tests/test_setup.py or tests/test_install_requires.py.
    • Structure: Use the unittest framework, consistent with the project's current tests.
    • Class Definition: Inherit from unittest.TestCase or an appropriate base class from your testing utilities.
  2. Develop Comprehensive Test Cases:

    • Valid Inputs:
      • Test parsing of lines with simple version constraints (e.g., package>=1.0).
      • Test handling of multiple version constraints (e.g., package>=1.0,!=1.5,<2.0), ensuring only the minimum constraint is preserved.
      • Include lines with comments, both full-line and inline.
      • Include empty lines and whitespace variations.
      • Test ignoring of options or URLs (lines starting with -, --, or containing URLs).
    • Invalid Inputs:
      • Test handling of malformed lines to ensure the function fails gracefully or skips them as per design.
      • Include lines with unsupported specifiers or invalid syntax.
  3. Leverage Existing Testing Infrastructure:

    • Integration:
      • Ensure the new test module follows the naming convention (test_*.py) to be automatically discovered.
      • Include the tests in the existing test suites by importing them if necessary.
    • Entry Points:
      • The Makefile targets (test, test-pytest, test-style) should already accommodate tests in the tests/ directory.
      • Verify that running make test includes your new tests.
  4. Reference Project Documentation:

    • docs/CI.md:
      • Align with the testing methodologies described, such as the use of unittest and focus on minimal acceptance tests (MATs).
      • Ensure that any new testing scripts comply with CI configurations to maintain integration with GitHub Actions and CircleCI.
  5. Consider Doctests:

    • Inline Documentation Testing:
      • Add doctests within the docstring of parse_requirements_for_install_requires.
      • Provide example usage and expected outputs.
      • Ensure doctests are included in the test run, possibly by adding a target in the Makefile or configuring test discovery, as shown in the sketch after this list.
  6. Review Relevant Learnings:

    ID                                   | Learning
    ------------------------------------ | --------
    baf104a2-37c4-4716-9b3f-46bd5c116083 | Reactive-firewall prefers to keep the word "own" in headings when emphasizing the user's personal test.
    b025693d-76c9-40a5-948b-3a921c2b85f0 | Reactive-firewall prefers to use "Caveat" as a heading instead of "Note".
    baf104a2-37c4-4716-9b3f-46bd5c116083 | In setup.py, keep the readFile function as is and add a new function to parse requirements.txt for install_requires, preserving only minimal constraints.
  7. Select Compatible Testing Approach:

    • Unittests Preferred:
      • Stick with the unittest framework to maintain consistency with existing tests.
      • This facilitates easier integration and adherence to current testing practices.
    • Avoid Additional Dependencies:
      • By using standard libraries and existing tools, you keep the testing environment straightforward.
  8. Enhance Testing Coverage:

    • Edge Cases:
      • Test with international characters or unusual package names to ensure robustness.
      • Include tests for Python 3.0 to 3.13 to verify backward compatibility.
    • Automation:
      • Ensure the tests run in the CI/CD pipeline by updating configurations if necessary.
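
Picking up on the doctest suggestion in item 5 above, example usage could be embedded in the docstring and exercised via doctest.testmod(). The helper name and expected outputs below are only a hedged sketch under the parsing behaviour described earlier, not the project's actual code:

```python
import doctest


def _minimal_constraint(line):
    """Illustrative helper: keep only the minimum '>=' bound of one requirement line.

    >>> _minimal_constraint('package>=1.0,!=1.5,<2.0')
    'package>=1.0'
    >>> _minimal_constraint('package')
    'package'
    """
    name, _, rest = line.partition('>=')
    if not rest:
        return line.strip()  # no '>=' bound present; leave the requirement untouched
    version = rest.split(',', 1)[0].strip()  # drop any further constraints after a comma
    return name.strip() + '>=' + version


if __name__ == '__main__':
    # e.g. run `python -m doctest this_module.py -v`, or wire doctest.testmod() into `make test`
    doctest.testmod(verbose=False)
```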

Let me know if you need further assistance or have any questions!

(✿◠‿◠)


Learnings added

```
Learnt from: reactive-firewall
PR: reactive-firewall/multicast#118
File: tox.ini:574-574
Timestamp: 2024-09-19T21:35:58.747Z
Learning: Reactive-firewall prefers to integrate new tests into the existing test suite, creating new test modules in the `tests/` directory and utilizing `unittest` for consistency.
```

> Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure [review instructions](https://docs.coderabbit.ai/guides/review-instructions). You can manage existing learnings in the [app](https://app.coderabbit.ai/learnings).

Learnings used

```
Learnt from: reactive-firewall
PR: reactive-firewall/multicast#110
File: docs/USAGE.md:66-66
Timestamp: 2024-09-17T03:38:28.677Z
Learning: Reactive-firewall prefers to use "Caveat" as a heading instead of "Note".
```

_Originally posted by @coderabbitai[bot] in https://github.com/reactive-firewall/multicast/pull/118#discussion_r1767628067_


@reactive-firewall,

Certainly! I'd be glad to help refresh your memory.

1. skipsdist and usedevelop options in tox:


2. MANIFEST.in and its full syntax:


3. Expected format of CHANGELOG.md and its contents:


4. Expected format of HISTORY.md and its contents:


5. Impacts of PEP 517 and PEP 518 on MANIFEST.in and `.md` files:


Next Steps:

Regarding your earlier comment about testing the new functions and the inclusion of MANIFEST.in and rebuilding the source distribution:

Updated Testing Plan:

  1. Testing MANIFEST.in Changes:

    • Objective: Ensure that all necessary files are included in the source distribution.

    • Actions:

      • Create Tests in tests/test_manifest.py:
      • Write a test case that builds the source distribution using python setup.py sdist.
      • Use tarfile or zipfile modules to inspect the contents of the created archive.
      • Assert that requirements.txt, README.md, LICENSE.md, CHANGELOG.md, and other necessary files are present.
    • Example Test Case:

      import unittest
      import os
      import tarfile

      class TestManifest(unittest.TestCase):

          def test_files_in_sdist(self):
              os.system('python setup.py sdist')
              dist_files = os.listdir('dist')
              self.assertTrue(len(dist_files) > 0, 'No files in dist directory')
              sdist_path = os.path.join('dist', dist_files[0])
              with tarfile.open(sdist_path, 'r:gz') as tar:
                  members = tar.getnames()
                  expected_files = [
                      'multicast-1.5.0/README.md',
                      'multicast-1.5.0/LICENSE.md',
                      'multicast-1.5.0/requirements.txt',
                  ]
                  for file in expected_files:
                      self.assertIn(file, members)
  2. Testing the Build Process under PEP 517/518:

    • Objective: Ensure that building the package using pyproject.toml works as expected.
    • Actions:
      • Use the build Module:
      • Install build if not already installed: pip install build.
      • Run python -m build to build both sdist and wheel.
      • Automate in Tests:
      • Write a test that runs subprocess calls to python -m build and assert that the build completes without errors.
    • Example Test Case:

      import unittest
      import subprocess

      class TestPEP517Build(unittest.TestCase):

          def test_build_package(self):
              result = subprocess.run(
                  ['python', '-m', 'build', '--sdist', '--wheel'],
                  stdout=subprocess.PIPE, stderr=subprocess.PIPE
              )
              self.assertEqual(result.returncode, 0, msg=result.stderr.decode())
  3. Updating CI Configurations:

    • GitHub Actions / CircleCI:
      • Add steps to your CI pipelines to execute the new tests.
      • Ensure that the build artifacts are created and examined during CI runs.
  4. Documenting the Process:

    • Update docs/CI.md:
      • Include information about the new tests.
      • Provide instructions on how to manually build and inspect the source distribution.
      • Explain the importance of including necessary files in the distribution.
  5. Ensure Backward Compatibility:

    • Run Tests Across Supported Python Versions:
      • Update your tox.ini to include these tests in each environment.
      • Validate that the tests pass in all versions from Python 3.5 to 3.13.
  6. Review and Reflect:

    • Testing the Parsing Function:
      • Ensure that your tests for parse_requirements_for_install_requires cover cases where files may or may not be present.
      • Test the behavior when requirements.txt is missing to ensure it fails gracefully.

Let me know if you need any more information or assistance with these updates!

(^▽^)


Learnings used

```
Learnt from: reactive-firewall
PR: reactive-firewall/multicast#110
File: docs/USAGE.md:66-66
Timestamp: 2024-09-17T03:38:28.677Z
Learning: Reactive-firewall prefers to use "Caveat" as a heading instead of "Note".
```

_Originally posted by @coderabbitai[bot] in https://github.com/reactive-firewall/multicast/pull/118#discussion_r1767685040_

reactive-firewall commented 1 month ago

@coderabbitai, please read up on all the detail and comments of this issue ( #119 )

coderabbitai[bot] commented 1 month ago

> [!TIP]
> For best results, initiate chat on the files or code changes.

@reactive-firewall,

I've reviewed all the details and comments in issue #119 and the changes introduced in PR #121. Here's a comprehensive plan for testing the new parse_requirements_for_install_requires function, the build process, and the MANIFEST.in file generated by make MANIFEST.in:


1. Testing parse_requirements_for_install_requires Function

a. Create a New Test Module

b. Develop Comprehensive Test Cases

c. Integrate with Existing Testing Infrastructure


2. Testing the Build Process

a. Verify Source Distribution Contents

b. Automate Build Verification


3. Testing the MANIFEST.in File

a. Validate MANIFEST.in Generation

b. Test File Inclusion in Built Packages


4. Integration with Continuous Integration (CI) Systems

a. Update CI Configuration

b. Coverage Reporting


5. Documentation Updates

a. Update Code Documentation

b. Update Project Documentation


6. Apply Previous Learnings and Preferences

a. Testing Approach

b. Documentation Style Preferences


Let me know if you need further details or assistance with implementing this plan!

Analysis chain

Scripts executed

The following scripts were executed for the analysis:

```shell
#!/bin/bash
# Fetch the details and comments of issue #119
gh issue view 119 --json title,body,comments

# Fetch the details and comments of PR #121
gh pr view 121 --json title,body,comments,files

# Fetch the diff of PR #121
gh pr diff 121
```

Length of output: 82394

Learnings used

```
Learnt from: reactive-firewall
PR: reactive-firewall/multicast#110
File: docs/CI.md:32-32
Timestamp: 2024-09-17T06:40:42.918Z
Learning: Reactive-firewall prefers to keep the word "own" in headings when emphasizing the user's personal test.
```

---

```
Learnt from: reactive-firewall
PR: reactive-firewall/multicast#110
File: docs/USAGE.md:66-66
Timestamp: 2024-09-17T03:38:28.677Z
Learning: Reactive-firewall prefers to use "Caveat" as a heading instead of "Note".
```
reactive-firewall commented 1 month ago

@coderabbitai, responding to your comment on this issue ( #119 ) I have one adjustment to the plan:

Regarding 1. a.

  1. a. Class Definition: Inherit from unittest.TestCase

Should be: 1. a. "Class Definition: Inherit from context.BasicUsageTestSuite" (read tests/context.py and you'll find this class; read tests/test_deps.py and you'll see it used).

coderabbitai[bot] commented 1 month ago

> [!TIP]
> For best results, initiate chat on the files or code changes.

@reactive-firewall,

Thank you for the adjustment! I appreciate the clarification.

Updated Plan:

1. a. Class Definition: Inherit from context.BasicUsageTestSuite

Let me know if there's anything else you'd like to modify or if you have further suggestions!

(•́‿•̀)


Learnings added

```
Learnt from: reactive-firewall
URL: https://github.com/reactive-firewall/multicast/issues/119
Timestamp: 2024-09-20T04:29:04.348Z
Learning: When adding new test classes, inherit from `context.BasicUsageTestSuite` instead of `unittest.TestCase` to maintain consistency with the project's testing framework.
```

> Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure [review instructions](https://docs.coderabbit.ai/guides/review-instructions). You can manage existing learnings in the [app](https://app.coderabbit.ai/learnings).

Learnings used

```
Learnt from: reactive-firewall
PR: reactive-firewall/multicast#110
File: docs/CI.md:32-32
Timestamp: 2024-09-17T06:40:42.918Z
Learning: Reactive-firewall prefers to keep the word "own" in headings when emphasizing the user's personal test.
```

---

```
Learnt from: reactive-firewall
PR: reactive-firewall/multicast#110
File: docs/USAGE.md:66-66
Timestamp: 2024-09-17T03:38:28.677Z
Learning: Reactive-firewall prefers to use "Caveat" as a heading instead of "Note".
```
reactive-firewall commented 1 month ago

@coderabbitai, continuing from our discussion in this ticket ( #119 )

Using the discussed and updated plan (from #119), generate possible implementations of the test code.

coderabbitai[bot] commented 1 month ago

> [!TIP]
> For best results, initiate chat on the files or code changes.

@reactive-firewall,

Certainly! Based on our updated plan, here are possible implementations of the test code:


1. Implementing Tests for parse_requirements_for_install_requires

Create a new test module tests/test_install_requires.py:

# tests/test_install_requires.py

import os
import unittest
from context import BasicUsageTestSuite
from setup import parse_requirements_for_install_requires

class TestParseRequirements(BasicUsageTestSuite):

    def setUp(self):
        super(TestParseRequirements, self).setUp()
        # Create a temporary requirements file for testing
        self.requirements_file = 'test_requirements.txt'

    def tearDown(self):
        # Clean up the temporary requirements file
        if os.path.exists(self.requirements_file):
            os.remove(self.requirements_file)
        super(TestParseRequirements, self).tearDown()

    def write_requirements(self, content):
        with open(self.requirements_file, 'w') as f:
            f.write(content)

    def test_simple_version_constraint(self):
        """Test parsing a simple version constraint."""
        self.write_requirements('package>=1.0\n')
        install_requires = parse_requirements_for_install_requires(self.requirements_file)
        self.assertEqual(install_requires, ['package>=1.0'])

    def test_multiple_version_constraints(self):
        """Test parsing multiple version constraints."""
        self.write_requirements('package>=1.0,!=1.5,<2.0\n')
        install_requires = parse_requirements_for_install_requires(self.requirements_file)
        self.assertEqual(install_requires, ['package>=1.0'])

    def test_comments_and_empty_lines(self):
        """Test handling comments and empty lines."""
        content = '''
        # This is a full-line comment

        package>=1.0  # This is an inline comment
        '''
        self.write_requirements(content)
        install_requires = parse_requirements_for_install_requires(self.requirements_file)
        self.assertEqual(install_requires, ['package>=1.0'])

    def test_options_and_urls_ignored(self):
        """Test that options and URLs are ignored."""
        content = '''
        -e git+https://github.com/user/repo.git#egg=package
        --find-links https://download.example.com
        http://example.com/package.tar.gz
        '''
        self.write_requirements(content)
        install_requires = parse_requirements_for_install_requires(self.requirements_file)
        self.assertEqual(install_requires, [])

    def test_malformed_lines(self):
        """Test handling of malformed requirement lines."""
        self.write_requirements('bad_package==\n')
        install_requires = parse_requirements_for_install_requires(self.requirements_file)
        self.assertEqual(install_requires, [])

    def test_nonexistent_requirements_file(self):
        """Test behavior when requirements file does not exist."""
        install_requires = parse_requirements_for_install_requires('nonexistent.txt')
        self.assertEqual(install_requires, [])
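
For day-to-day development, a quick way to exercise only this module could be something along these lines (a hedged convenience snippet run from the repository root; make test should already discover the module once it follows the test_*.py naming convention):

```python
# Hypothetical one-off runner for the new module, executed from the repository root.
import unittest

suite = unittest.defaultTestLoader.discover('tests', pattern='test_install_requires.py')
result = unittest.TextTestRunner(verbosity=2).run(suite)
raise SystemExit(0 if result.wasSuccessful() else 1)
```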

2. Testing the Build Process and MANIFEST.in

Create a new test module tests/test_manifest.py:

# tests/test_manifest.py

import os
import tarfile
import unittest
from context import BasicUsageTestSuite

class TestManifestInclusion(BasicUsageTestSuite):

    def setUp(self):
        super(TestManifestInclusion, self).setUp()
        # Clean previous builds
        os.system('python setup.py clean --all')

    def test_sdist_includes_required_files(self):
        """Test that the source distribution includes all required files."""
        # Build the source distribution
        os.system('python setup.py sdist --formats=gztar')
        dist_dir = os.path.join(os.getcwd(), 'dist')
        dist_files = os.listdir(dist_dir)
        self.assertTrue(len(dist_files) > 0, 'No files found in dist directory.')
        sdist_path = os.path.join(dist_dir, dist_files[0])

        # Open the tar.gz file and inspect contents
        with tarfile.open(sdist_path, 'r:gz') as tar:
            members = tar.getnames()
            expected_files = [
                'multicast-1.5.0/README.md',
                'multicast-1.5.0/LICENSE.md',
                'multicast-1.5.0/requirements.txt',
                'multicast-1.5.0/CHANGELOG.md',
                'multicast-1.5.0/setup.py',
                'multicast-1.5.0/manifest.in',
                # Include other important files and directories
            ]
            for expected_file in expected_files:
                self.assertIn(expected_file, members, f'Missing {expected_file} in sdist.')

    def test_sdist_excludes_unwanted_files(self):
        """Test that the source distribution excludes unwanted files."""
        # Build the source distribution
        os.system('python setup.py sdist --formats=gztar')
        dist_dir = os.path.join(os.getcwd(), 'dist')
        dist_files = os.listdir(dist_dir)
        sdist_path = os.path.join(dist_dir, dist_files[0])

        # Open the tar.gz file and inspect contents
        with tarfile.open(sdist_path, 'r:gz') as tar:
            members = tar.getnames()
            unwanted_files = [
                'multicast-1.5.0/.gitignore',
                'multicast-1.5.0/.github/',
                'multicast-1.5.0/tests/',
                # Exclude other files or directories as specified in MANIFEST.in
            ]
            for unwanted_file in unwanted_files:
                self.assertNotIn(unwanted_file, members, f'Unwanted file {unwanted_file} found in sdist.')

3. Testing the Build Process Under PEP 517/518

Add tests to verify building with pyproject.toml specifications:

# tests/test_build.py

import os
import subprocess
import unittest
from context import BasicUsageTestSuite

class TestPEP517Build(BasicUsageTestSuite):

    def test_build_with_pep517(self):
        """Test building the package using PEP 517 standards."""
        # Clean previous builds
        os.system('python setup.py clean --all')

        # Build the package using 'build' module
        result = subprocess.run(
            ['python', '-m', 'build', '--sdist', '--wheel'],
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE
        )
        self.assertEqual(result.returncode, 0, msg=result.stderr.decode())

        # Verify that the dist directory contains the expected files
        dist_dir = os.path.join(os.getcwd(), 'dist')
        dist_files = os.listdir(dist_dir)
        expected_files = [
            'multicast-1.5.0.tar.gz',
            'multicast-1.5.0-py3-none-any.whl',
        ]
        for expected in expected_files:
            self.assertIn(expected, dist_files, f'Missing {expected} in dist directory.')

4. Adjusting Existing Tests

If necessary, update existing tests to accommodate the new function and ensure compatibility across all supported Python versions (3.5 to 3.13).


5. Incorporating Previous Learnings


Let me know if these implementations align with your expectations or if you'd like any modifications!


Learnings added

```
Learnt from: reactive-firewall
URL: https://github.com/reactive-firewall/multicast/issues/119
Timestamp: 2024-09-20T04:37:02.389Z
Learning: When creating test modules, use `context.BasicUsageTestSuite` as the base class instead of `unittest.TestCase` to align with the project's testing framework.
```

> Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure [review instructions](https://docs.coderabbit.ai/guides/review-instructions). You can manage existing learnings in the [app](https://app.coderabbit.ai/learnings).

Learnings used

```
Learnt from: reactive-firewall
PR: reactive-firewall/multicast#110
File: docs/CI.md:32-32
Timestamp: 2024-09-17T06:40:42.918Z
Learning: Reactive-firewall prefers to keep the word "own" in headings when emphasizing the user's personal test.
```

---

```
Learnt from: reactive-firewall
PR: reactive-firewall/multicast#110
File: docs/USAGE.md:66-66
Timestamp: 2024-09-17T03:38:28.677Z
Learning: Reactive-firewall prefers to use "Caveat" as a heading instead of "Note".
```
reactive-firewall commented 1 month ago

@coderabbitai, in reply to your last comment in this issue ( #119 ),

  1. I've implemented the first testing module, tests/test_install_requires.py, and integrated it into the existing tests (part 4). However, I had to debug it a bit and make some changes to the tests, because your implementation passed a filename while the parse_requirements_for_install_requires function expects the file's contents, as setup.py's readFile function would return them. This leads to two notable changes to the tests. First, the "test_malformed_lines" test-case now needs to expect ['bad_package'] to be inferred by our implementation when passed 'bad_package==\n' as content (basically we only strip the '==', because the name is not eliminated, which is correct for what we implemented). I also added the constraint to require pinned packages in our project, and removed the inclusion of un-matched lines (also dropping comments now); I'll push these changes so we can review them together in a PR/MR, and I look forward to your feedback there. The second notable change is that we now conflict with the package named "See". ref commit ba42693

  2. I've implemented the second new testing module, tests/test_manifest.py, and integrated it into the existing tests (part 4). However, I had to debug it a bit and make some changes to the tests: I found the os.system call an unacceptable security risk, even in tests, and f-strings don't work as far back as Python 3.3. Also, your code assumed os.listdir(dist_dir) would return a reverse-sorted listing, so I added sorted(..., reverse=True) to fix that. And the expected-files test fails on systems with a case-sensitive filesystem, so "manifest.in" needed to be "MANIFEST.in". We need to update the license info in 'tests/requirements.txt' to include mention of tarfile; please check PyPI and the official Python 3 documentation for the license of the 'tarfile' module. ref commit e7b7d68

  3. We seem to have overlooked building some test-tooling for the "MANIFEST.in" file. Let's implement a tool similar to our other custom linter script-tools (the ones in the tests/ directory matching the glob "check_*"); a rough sketch follows after this list. We can still use the criteria you suggested in this ticket ( #119 ):

    1. Testing the MANIFEST.in File

    a. Validate MANIFEST.in Generation

    Action: Write tests to ensure make MANIFEST.in generates the correct content.
    Verification Steps: Check that MANIFEST.in includes all necessary include, exclude, recursive-include, and prune directives. Confirm that patterns match the project structure and desired packaging rules.

    b. Test File Inclusion in Built Packages

    Action: After building the distribution, confirm that the package contains the files as per the specifications in MANIFEST.in.
    Tools: Use package inspection tools or scripts to list contents and compare against expected results.

  4. I've implemented the third new testing module, 'tests/test_build.py', and integrated it into the existing tests (part 4). Again some changes were needed, including to the expected files: 'multicast-1.5.0-py3-none-any.whl' (failed) changed to 'multicast-1.5.0-py2.py3-none-any.whl' (passed). We will need to discuss whether this violates PEP 517 in any way, or whether the test is still valid, in a PR/MR before this test code is merged into the 'stable' branch. ref commit cb676c4
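
Following up on item 3 above, a rough sketch of such a checker is below. It is written in Python for illustration even though the existing check_* tools may be shell scripts, and the required directives and file names are only placeholders to be replaced with the project's real packaging rules:

```python
#!/usr/bin/env python3
# tests/check_manifest_in.py -- hypothetical checker modelled on the project's
# other check_* tools; the required directives and lines below are placeholders.
import sys

REQUIRED_DIRECTIVES = ('include', 'exclude', 'recursive-include', 'prune')
REQUIRED_LINES = (
    'include requirements.txt',
    'include README.md',
    'include LICENSE.md',
)


def check_manifest(path='MANIFEST.in'):
    """Return a list of problems found in the given MANIFEST.in file."""
    lines = []
    with open(path) as handle:
        for raw in handle:
            stripped = raw.strip()
            if stripped and not stripped.startswith('#'):
                lines.append(stripped)
    problems = []
    directives = set(line.split(None, 1)[0] for line in lines)
    for directive in REQUIRED_DIRECTIVES:
        if directive not in directives:
            problems.append('missing directive: ' + directive)
    for required in REQUIRED_LINES:
        if required not in lines:
            problems.append('missing line: ' + required)
    return problems


if __name__ == '__main__':
    target = sys.argv[1] if len(sys.argv) > 1 else 'MANIFEST.in'
    issues = check_manifest(target)
    for issue in issues:
        print('check_manifest_in: ' + issue, file=sys.stderr)
    sys.exit(1 if issues else 0)
```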

coderabbitai[bot] commented 1 month ago

> [!TIP]
> For best results, initiate chat on the files or code changes.

@reactive-firewall,

Thank you for your detailed feedback and for implementing the test modules!


1. First Testing Module (tests/test_install_requires.py):

I'm glad you were able to integrate the first testing module and adjust it accordingly. I appreciate you pointing out that the parse_requirements_for_install_requires function expects file contents rather than a filename. Your adjustments to the tests, especially handling the "test_malformed_lines" case, make sense given the function's implementation.

Noting the conflict with the package named see as per commit ba42693 is important. We'll need to ensure that our parsing logic doesn't inadvertently include package names that could cause conflicts. I'm looking forward to reviewing your changes in the upcoming PR/MR.


2. Second Testing Module (tests/test_manifest.py):

Thank you for enhancing the security of the tests by avoiding os.system calls. Using such calls can indeed pose a security risk, even within test environments. Utilizing safer alternatives like the subprocess module with controlled inputs is a better practice.
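
For example, a pattern along these lines (the exact setup.py invocation is only illustrative):

```python
import subprocess

# Invoke the build step without a shell: arguments are passed as an explicit list,
# so nothing on the command line is subject to shell interpretation.
result = subprocess.run(
    ['python', 'setup.py', 'sdist', '--formats=gztar'],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE,
)
# In a test, assert on the return code instead of silently ignoring it.
assert result.returncode == 0, result.stderr.decode()
```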

Adjusting for compatibility with Python 3.3 by avoiding f-strings is also important to maintain support for older versions. Your fixes for sorting the directory listing and correcting the case sensitivity of "MANIFEST.in" are spot-on.

Regarding the tarfile module, it's part of the Python Standard Library, so there are no additional licensing requirements. However, it's always good practice to verify licensing information, so updating tests/requirements.txt to reflect any dependencies is prudent.


3. Testing Tool for MANIFEST.in:

Implementing a custom linter script-tool for MANIFEST.in sounds like a great approach. By creating a tool similar to the existing ones (e.g., those matching check_* in the tests/ directory), we can automate the validation of MANIFEST.in generation. This will help ensure that:

Additionally, verifying the contents of the built packages against the specifications in MANIFEST.in will help catch any discrepancies early in the development process.


4. Third Testing Module (tests/test_build.py):

Adjusting the expected wheel filename to 'multicast-1.5.0-py2.py3-none-any.whl' is appropriate since it reflects the actual build output. This change does not violate PEP 517. PEP 517 specifies a build-system independent format for source trees but doesn't impose restrictions on wheel filenames beyond the standard wheel naming conventions.

The test remains valid as long as it accurately checks the build artifacts produced by the build system. We can proceed with these tests and discuss any concerns during the PR/MR review before merging into the stable branch.
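
If you would like the test to be robust to either tag set, one hedged option (assuming the third-party packaging library is acceptable in the test environment) is to parse the wheel filename instead of hardcoding it:

```python
from pathlib import Path

from packaging.utils import parse_wheel_filename  # third-party 'packaging' distribution

# Hypothetical check: accept whatever multicast wheel the build produced, as long
# as it is a pure-Python, platform-independent wheel (abi 'none', platform 'any').
wheel_path = sorted(Path('dist').glob('multicast-*.whl'))[-1]  # IndexError if no wheel was built
name, version, build, tags = parse_wheel_filename(wheel_path.name)
assert str(name) == 'multicast'
assert all(tag.abi == 'none' and tag.platform == 'any' for tag in tags), tags
```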


I'm happy to assist further with reviewing the changes or addressing any other questions you might have. Please let me know if there's anything else you'd like me to help with!

(•‿•)


Learnings added

```
Learnt from: reactive-firewall
URL: https://github.com/reactive-firewall/multicast/issues/119
Timestamp: 2024-09-21T00:57:24.630Z
Learning: When writing tests, avoid using `os.system` calls due to security risks, even in tests. Use alternative methods like the `subprocess` module with proper safeguards.
```

---

```
Learnt from: reactive-firewall
URL: https://github.com/reactive-firewall/multicast/issues/119
Timestamp: 2024-09-21T00:57:24.630Z
Learning: Ensure tests are compatible with older Python versions (e.g., avoid using f-strings in Python 3.3).
```

> Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure [review instructions](https://docs.coderabbit.ai/guides/review-instructions). You can manage existing learnings in the [app](https://app.coderabbit.ai/learnings).

Learnings used

```
Learnt from: reactive-firewall
PR: reactive-firewall/multicast#110
File: docs/CI.md:32-32
Timestamp: 2024-09-17T06:40:42.918Z
Learning: Reactive-firewall prefers to keep the word "own" in headings when emphasizing the user's personal test.
```

---

```
Learnt from: reactive-firewall
PR: reactive-firewall/multicast#110
File: docs/USAGE.md:66-66
Timestamp: 2024-09-17T03:38:28.677Z
Learning: Reactive-firewall prefers to use "Caveat" as a heading instead of "Note".
```