ajitesh123 / auto-review-ai

🚀 AI-Powered Performance Review Generator
https://perfor-ai.streamlit.app/

Archieai: Add Audio Input for Performance Reviews #113

Open ajitesh123 opened 2 months ago

ajitesh123 commented 2 months ago

Details

No response

Branch

No response

greptile-apps[bot] commented 2 months ago
  1. In `app.py`, add an audio file uploader to the Streamlit UI:
    
```python
import streamlit as st
from review import ReviewRequest, generate_review, DEFAULT_QUESTIONS
from self_review import SelfReviewRequest, generate_self_review
import speech_recognition as sr

# Streamlit UI
st.set_page_config(page_title="Performance Review Assistant", layout="wide")

# Sidebar for common elements
with st.sidebar:
    st.title("Review Settings")
    review_type = st.radio("Select Review Type", ["Performance Review", "Self-Review"])
    llm_type = st.selectbox('Select LLM Type', ['openai', 'google', 'anthropic', 'groq'])
    model_size = st.selectbox('Select Model Size', ['small', 'medium', 'large'])
    user_api_key = st.text_input('Your API Key', type="password")

if review_type == "Performance Review":
    st.title("Write Performance Review in a Minute")

    st.text("""If no question is passed, the following are considered:
            1. Describe example(s) of the topics selected. What was the context? What actions did they take?
            2. In your opinion, what impact did their actions have?
            3. What recommendations do you have for their growth and development? Your feedback can be about any area of their work.
            """)

    your_role = st.text_input('Your Role')
    candidate_role = st.text_input('Candidate Role')
    perf_question = st.text_area('Performance Review Questions (one per line)', height=100)
    your_review = st.text_area('Briefly describe your experience of working with the candidate including project, responsibility of candidate, unique things they did etc., in free flow writing', height=200)
    audio_file = st.file_uploader("Upload an audio file", type=["wav", "mp3"])

    if st.button('Generate Performance Review'):
        if not user_api_key:
            st.error("Please enter your API key in the sidebar.")
        elif not your_role or not candidate_role or not your_review:
            st.error("Please fill in all required fields.")
        else:
            try:
                if audio_file is not None:
                    recognizer = sr.Recognizer()
                    audio_data = sr.AudioFile(audio_file)
                    with audio_data as source:
                        audio = recognizer.record(source)
                    your_review += recognizer.recognize_google(audio)

                questions = perf_question.split('\n') if perf_question else DEFAULT_QUESTIONS.split('\n')
                review_request = ReviewRequest(
                    your_role=your_role,
                    candidate_role=candidate_role,
                    perf_question="\n".join(questions),
                    your_review=your_review,
                    llm_type=llm_type,
                    user_api_key=user_api_key,
                    model_size=model_size
                )
                review = generate_review(**review_request.model_dump())
                for qa in review:
                    st.markdown(f"**{qa['question']}**")
                    st.markdown(qa['answer'])
                    st.markdown("---")
            except Exception as e:
                st.error(f"An error occurred: {str(e)}")

else:  # Self-Review
    st.title("Generate Your Self-Review")
    st.text("""Provide a text dump of your performance information, specific questions you want to address, and any additional instructions for the AI to consider while generating your self-review.""")
    text_dump = st.text_area('Text Dump (information about your performance)', height=200)
    questions = st.text_area('Questions to Answer in Self-Review (one per line)', height=100)
    instructions = st.text_area('Additional Instructions (optional)', height=100)
    audio_file = st.file_uploader("Upload an audio file", type=["wav", "mp3"])

    if st.button('Generate Self-Review'):
        if not user_api_key:
            st.error("Please enter your API key in the sidebar.")
        elif not text_dump or not questions:
            st.error("Please provide both the text dump and questions.")
        else:
            try:
                if audio_file is not None:
                    recognizer = sr.Recognizer()
                    audio_data = sr.AudioFile(audio_file)
                    with audio_data as source:
                        audio = recognizer.record(source)
                    text_dump += recognizer.recognize_google(audio)

                question_list = [q.strip() for q in questions.split('\n') if q.strip()]
                self_review_request = SelfReviewRequest(
                    text_dump=text_dump,
                    questions=question_list,
                    instructions=instructions if instructions else None,
                    llm_type=llm_type,
                    user_api_key=user_api_key,
                    model_size=model_size
                )
                self_review = generate_self_review(**self_review_request.model_dump())
                for qa in self_review:
                    st.markdown(f"**{qa['question']}**")
                    st.markdown(qa['answer'])
                    st.markdown("---")
            except Exception as e:
                st.error(f"An error occurred: {str(e)}")
```

2. In `app_fastapi.py`, add an endpoint to accept audio files:
```python
from fastapi import FastAPI, HTTPException, UploadFile, File
from review import ReviewRequest, generate_review
from self_review import SelfReviewRequest, generate_self_review
from fastapi.middleware.cors import CORSMiddleware
import speech_recognition as sr

app = FastAPI()
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

@app.post("/generate_review")
async def api_generate_review(request: ReviewRequest, audio_file: UploadFile = File(None)):
    try:
        if audio_file is not None:
            recognizer = sr.Recognizer()
            audio_data = sr.AudioFile(audio_file.file)
            with audio_data as source:
                audio = recognizer.record(source)
            request.your_review += recognizer.recognize_google(audio)
        review = generate_review(
            request.your_role,
            request.candidate_role,
            request.perf_question,
            request.your_review,
            request.llm_type,
            request.user_api_key,
            request.model_size
        )
        return {"review": review}
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

@app.post("/generate_self_review")
async def api_generate_self_review(request: SelfReviewRequest, audio_file: UploadFile = File(None)):
    try:
        if audio_file is not None:
            recognizer = sr.Recognizer()
            audio_data = sr.AudioFile(audio_file.file)
            with audio_data as source:
                audio = recognizer.record(source)
            request.text_dump += recognizer.recognize_google(audio)
        review = generate_self_review(
            request.text_dump,
            request.questions,
            request.instructions,
            request.llm_type,
            request.user_api_key,
            request.model_size
        )
        return {"self_review": review}
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))

@app.get("/")
async def root():
    return {"message": "Welcome to the Performance Review API"}

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)
```
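
One practical caveat, not part of the original suggestion: `speech_recognition`'s `sr.AudioFile` reads WAV/AIFF/FLAC, so the mp3 uploads accepted by the snippets above would likely need converting before transcription. Below is a minimal sketch of such a helper, assuming `pydub` (and its ffmpeg dependency) were added to the project; the function names are illustrative, not part of the existing codebase:

```python
import io

import speech_recognition as sr
from pydub import AudioSegment  # assumed extra dependency; needs ffmpeg installed for mp3 input


def to_wav_buffer(uploaded_file) -> io.BytesIO:
    """Convert an uploaded wav/mp3 file-like object into an in-memory WAV buffer."""
    segment = AudioSegment.from_file(uploaded_file)
    buffer = io.BytesIO()
    segment.export(buffer, format="wav")
    buffer.seek(0)
    return buffer


def transcribe(uploaded_file) -> str:
    """Transcribe an uploaded audio file with the same Google recognizer used above."""
    recognizer = sr.Recognizer()
    with sr.AudioFile(to_wav_buffer(uploaded_file)) as source:
        audio = recognizer.record(source)
    return recognizer.recognize_google(audio)
```

The Streamlit page could then call `transcribe(audio_file)` and the FastAPI endpoints `transcribe(audio_file.file)` instead of building the recognizer inline.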

References

/app.py /app_fastapi.py /review.py /self_review.py

#### About Greptile

This response provides a starting point for your research, not a precise solution. Help us improve! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. [Ask Greptile](https://app.greptile.com/chat/github/ajitesh123/perf-review-ai/main) · [Edit Issue Bot Settings](https://app.greptile.com/apps/github)
ajitesh123 commented 2 months ago

@Archie

dev-archie-ai-code-explain-pr[bot] commented 2 months ago
# ๐Ÿ“ Issue Review Complete

Thank you for raising this issue. Here's my analysis:

1. The problem seems to be related to [specific detail].
2. A potential solution could be [suggested approach].
3. This issue might be connected to [related issue or PR].

Next steps:
- [ ] Verify the proposed solution
- [ ] Update affected documentation
- [ ] Create a pull request with the fix

Please let me know if you need any further clarification or assistance.

---
_Review completed by ArchieAI_
dev-archie-ai-code-explain-pr[bot] commented 2 months ago
# ๐Ÿ” Analyzing Issue

```
[===       ]  Processing...
```

Please wait while I review this issue. This may take a few moments.

- 📊 Analyzing issue details
- 🧠 Evaluating potential solutions
- 🔬 Checking for related issues
- 📝 Preparing detailed response

*This comment will be updated with the final review shortly.*
dev-archie-ai-code-explain-pr[bot] commented 2 months ago

one to just get this going

dev-archie-ai-code-explain-pr[bot] commented 2 months ago

๐Ÿ” Processing Issue [=== ] ๐™‹๐™ง๐™ค๐™˜๐™š๐™จ๐™จ๐™ž๐™ฃ๐™œ...

We will do the following steps: ๐Ÿ” Problem discovery: Understand the issue and find related files/code ๐Ÿ’ก Determine approach: Decide higher level plan approach ๐Ÿ“ Implementation plan: Create a detailed plan of action ๐Ÿ’ป Code generation: Write the code to solve the issue

This issue will be updated as we make progress.

ajitesh123 commented 1 month ago

Design Proposal

Root Cause Analysis

The root cause of the issue appears to be in the handle_push_event function within the app/routers/git_service.py file. Specifically, the error occurs when trying to access the "id" key of the commit_data dictionary, which is None. This suggests that the get_detailed_commit_info function in app/utils/repo/github_utils.py is not returning the expected data structure, possibly due to missing or invalid data in the GitHub webhook payload.

The most likely resolution for this issue involves improving error handling and data validation in both the handle_push_event and get_detailed_commit_info functions. By adding proper checks for None values and implementing more robust error handling, we can prevent the TypeError and provide more informative error messages or graceful fallbacks when dealing with unexpected data structures.
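
For context, a minimal, hypothetical reproduction of the failure mode described above (the payload values are illustrative, not taken from a real webhook):

```python
# A push payload without a head commit, e.g. a branch deletion (hypothetical values)
payload = {"ref": "refs/heads/main", "repository": {"full_name": "octo/demo"}}

commit_data = payload.get("head_commit")  # -> None

try:
    commit_data["id"]  # what the current handler effectively does
except TypeError as exc:
    print(exc)  # 'NoneType' object is not subscriptable
```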

Proposed Changes

Update error handling in handle_push_event function

  1. Modify the handle_push_event function in app/routers/git_service.py:

    • Add a check for None commit_data before accessing its keys
    • Implement a try-except block to catch potential errors
    • Log relevant information for debugging purposes
  2. Update the commit creation process:

    • Add error handling for cases where commit_data is None or missing expected keys
    • Provide default values or skip commit creation if necessary data is missing

Here's the implementation of these changes:

  1. Update the handle_push_event function in app/routers/git_service.py:

    • Add the following imports at the top of the file:
      from app.logger import logger
    • Modify the handle_push_event function as follows:
      
```python
async def handle_push_event(payload: dict, installation_id: str, user, repo):
    handler = GitHubCommitEvent(payload)
    ref = handler.ref
    if not ref.startswith("refs/heads/"):
        return  # Not a branch push

    repository = handler.repository_name

    github = Github(github_installation_id=installation_id, github_is_read_access=True)

    prs = await github.get_all_pull_requests(repository, params={
        "head": f"{handler.repository['owner']['name']}:{handler.branch_name}",
        "state": "open"
    })

    # Get the commit information
    commit_data = handler.head_commit
    if not commit_data:
        logger.warning("No commit data found in the payload")
        return {"success": True, "msg": "No commit data found in the payload"}

    try:
        commit_info = await github.get_detailed_commit_info(repository, commit_data["id"])

        logger.info(f"Commit info: {commit_info}")

        for pr in prs["pull_requests"]:
            pr_node_id = pr["node_id"]

            pr_document = await get_pull_request_by_filter({ "pr_origin" : "GITHUB", "git_node_id": pr_node_id })

            if pr_document:
                commit_info["pr_id"] = pr_document["_id"]
                await create_commit(commit_info)
            else:
                logger.warning(f"PR document not found for node_id: {pr_node_id}")

        return {"success": True, "msg": "Processed push event successfully"}

    except Exception as e:
        logger.error(f"Error processing push event: {str(e)}")
        return {"success": False, "msg": f"Error processing push event: {str(e)}"}
```

Update error handling in get_detailed_commit_info function

  1. Modify the get_detailed_commit_info function in app/utils/repo/github_utils.py:
    • Add error handling for API requests
    • Implement data validation for the response
    • Return None or a default structure if the required data is missing

Here's the implementation of these changes:

  1. Update the get_detailed_commit_info function in app/utils/repo/github_utils.py:

    • Add the following imports at the top of the file if not already present:
      from app.logger import logger
      from typing import Optional, Dict, Any
    • Modify the get_detailed_commit_info function as follows:
      
```python
async def get_detailed_commit_info(self, repo: str, commit_sha: str) -> Optional[Dict[str, Any]]:
    url = f"{self.base_url}/repos/{repo}/commits/{commit_sha}"
    headers = {"Authorization": f"Bearer {await self.get_access_token()}"}

    try:
        async with AsyncClient() as client:
            response = await client.get(url, headers=headers)
            response.raise_for_status()
            commit_data = response.json()

        if not isinstance(commit_data, dict):
            logger.error(f"Unexpected commit data format: {commit_data}")
            return None

        commit = commit_data.get("commit", {})
        author = commit.get("author", {})
        committer = commit.get("committer", {})

        return {
            "sha": commit_data.get("sha"),
            "tree_sha": commit_data.get("commit", {}).get("tree", {}).get("sha"),
            "comment_count": commit.get("comment_count", 0),
            "author": author.get("name"),
            "commit_time": committer.get("date"),
            "files_changed": [
                {
                    "filename": file.get("filename"),
                    "additions": file.get("additions", 0),
                    "deletions": file.get("deletions", 0),
                    "changes": file.get("changes", 0),
                    "status": file.get("status")
                }
                for file in commit_data.get("files", [])
            ],
            "message": commit.get("message")
        }
    except Exception as e:
        logger.error(f"Error fetching commit details for {repo}/{commit_sha}: {str(e)}")
        return None
```

Add logging for debugging

  1. Add logging statements in the handle_push_event function:

    • Log the incoming payload
    • Log the results of key operations
  2. Add logging statements in the get_detailed_commit_info function:

    • Log the API request details
    • Log the response data

Here's the implementation of these changes:

  1. Update the handle_push_event function in app/routers/git_service.py:

```python
async def handle_push_event(payload: dict, installation_id: str, user, repo):
    logger.info(f"Received push event payload: {payload}")
    handler = GitHubCommitEvent(payload)
    ref = handler.ref
    if not ref.startswith("refs/heads/"):
        logger.info(f"Ignored push to non-branch ref: {ref}")
        return {"success": True, "msg": f"Ignored push to non-branch ref: {ref}"}
    
    repository = handler.repository_name
    
    github = Github(github_installation_id=installation_id, github_is_read_access=True)
    
    prs = await github.get_all_pull_requests(repository, params={
        "head": f"{handler.repository['owner']['name']}:{handler.branch_name}",
        "state": "open"
    })
    logger.info(f"Found {len(prs['pull_requests'])} open PRs for branch {handler.branch_name}")
    
    # Get the commit information
    commit_data = handler.head_commit
    if not commit_data:
        logger.warning("No commit data found in the payload")
        return {"success": True, "msg": "No commit data found in the payload"}
    
    logger.info(f"Processing commit: {commit_data['id']}")
    
    try:
        commit_info = await github.get_detailed_commit_info(repository, commit_data["id"])
    
        logger.info(f"Commit info: {commit_info}")
    
        for pr in prs["pull_requests"]:
            pr_node_id = pr["node_id"]
    
            pr_document = await get_pull_request_by_filter({ "pr_origin" : "GITHUB", "git_node_id": pr_node_id })
    
            if pr_document:
                commit_info["pr_id"] = pr_document["_id"]
                created_commit = await create_commit(commit_info)
                logger.info(f"Created commit entry: {created_commit}")
            else:
                logger.warning(f"PR document not found for node_id: {pr_node_id}")
    
        return {"success": True, "msg": "Processed push event successfully"}
    except Exception as e:
        logger.error(f"Error processing push event: {str(e)}", exc_info=True)
        return {"success": False, "msg": f"Error processing push event: {str(e)}"}
  2. Update the get_detailed_commit_info function in app/utils/repo/github_utils.py:

```python
async def get_detailed_commit_info(self, repo: str, commit_sha: str) -> Optional[Dict[str, Any]]:
    url = f"{self.base_url}/repos/{repo}/commits/{commit_sha}"
    headers = {"Authorization": f"Bearer {await self.get_access_token()}"}
    
    logger.info(f"Fetching commit details for {repo}/{commit_sha}")
    
    try:
        async with AsyncClient() as client:
            response = await client.get(url, headers=headers)
            response.raise_for_status()
            commit_data = response.json()
    
        logger.info(f"Received commit data: {commit_data}")
    
        if not isinstance(commit_data, dict):
            logger.error(f"Unexpected commit data format: {commit_data}")
            return None
    
        commit = commit_data.get("commit", {})
        author = commit.get("author", {})
        committer = commit.get("committer", {})
    
        result = {
            "sha": commit_data.get("sha"),
            "tree_sha": commit_data.get("commit", {}).get("tree", {}).get("sha"),
            "comment_count": commit.get("comment_count", 0),
            "author": author.get("name"),
            "commit_time": committer.get("date"),
            "files_changed": [
                {
                    "filename": file.get("filename"),
                    "additions": file.get("additions", 0),
                    "deletions": file.get("deletions", 0),
                    "changes": file.get("changes", 0),
                    "status": file.get("status")
                }
                for file in commit_data.get("files", [])
            ],
            "message": commit.get("message")
        }
    
        logger.info(f"Processed commit info: {result}")
        return result
    except Exception as e:
        logger.error(f"Error fetching commit details for {repo}/{commit_sha}: {str(e)}", exc_info=True)
        return None
```
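
Since `get_detailed_commit_info` can now return `None`, the caller should probably guard for that as well before setting `commit_info["pr_id"]`. A sketch of the lines that could sit in `handle_push_event` right after the fetch (names as in the code above; an untested fragment, not a definitive implementation):

```python
commit_info = await github.get_detailed_commit_info(repository, commit_data["id"])

if commit_info is None:
    # Fetch failed or returned unexpected data; skip commit creation instead of raising a TypeError
    logger.warning(f"No detailed commit info for {commit_data['id']}; skipping commit creation")
    return {"success": False, "msg": "Failed to fetch detailed commit information"}
```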

Update GitHubCommitEvent class to handle missing data

  1. Modify the GitHubCommitEvent class in app/utils/repo/github_utils.py:
    • Add error handling for missing or invalid data in the payload
    • Provide default values for required fields
    • Log warnings for missing data

Here's the implementation of these changes:

  1. Update the GitHubCommitEvent class in app/utils/repo/github_utils.py:

```python
class GitHubCommitEvent:
    def __init__(self, payload: dict):
        self.payload = payload
        self.ref = payload.get('ref', '')
        self.repository = payload.get('repository', {})
        self.head_commit = payload.get('head_commit')
    
        if not self.repository:
            logger.warning("Repository information missing in the payload")
    
        if not self.head_commit:
            logger.warning("Head commit information missing in the payload")
    
        self.branch_name = self.ref.split('/')[-1] if self.ref.startswith('refs/heads/') else ''
    
        if not self.branch_name:
            logger.warning(f"Unable to determine branch name from ref: {self.ref}")
    
    @property
    def repository_name(self) -> str:
        return self.repository.get('full_name', '')
    
    @property
    def commit_sha(self) -> str:
        return self.head_commit.get('id', '') if self.head_commit else ''
    
    @property
    def commit_message(self) -> str:
        return self.head_commit.get('message', '') if self.head_commit else ''
    
    @property
    def commit_author(self) -> dict:
        return self.head_commit.get('author', {}) if self.head_commit else {}
    
    @property
    def commit_timestamp(self) -> str:
        return self.head_commit.get('timestamp', '') if self.head_commit else ''
    
    def get_commit_files(self) -> List[dict]:
        if not self.head_commit:
            return []
        return self.head_commit.get('added', []) + self.head_commit.get('removed', []) + self.head_commit.get('modified', [])
    
    def __str__(self) -> str:
        return f"GitHubCommitEvent(repo={self.repository_name}, branch={self.branch_name}, sha={self.commit_sha})"

Add unit tests for handle_push_event function

  1. Create a new test file for git_service.py:
    • Create a new file named test_git_service.py in the tests directory
    • Import necessary modules and functions
    • Write unit tests for the handle_push_event function

Here's the implementation of these changes:

  1. Create a new file named test_git_service.py in the tests directory:
    
```python
import pytest
from unittest.mock import AsyncMock, patch
from app.routers.git_service import handle_push_event
from app.utils.repo.github_utils import Github, GitHubCommitEvent


@pytest.fixture
def mock_payload():
    return {
        "ref": "refs/heads/main",
        "repository": {
            "full_name": "test/repo",
            "owner": {"name": "test"}
        },
        "head_commit": {
            "id": "abc123",
            "message": "Test commit",
            "author": {"name": "Test Author"},
            "timestamp": "2023-05-01T12:00:00Z"
        }
    }


@pytest.fixture
def mock_user():
    return {"_id": "user123"}


@pytest.fixture
def mock_repo():
    return {"_id": "repo123"}


@pytest.mark.asyncio
async def test_handle_push_event_success(mock_payload, mock_user, mock_repo):
    with patch('app.routers.git_service.Github', new_callable=AsyncMock) as MockGithub, \
         patch('app.routers.git_service.get_pull_request_by_filter', new_callable=AsyncMock) as mock_get_pr, \
         patch('app.routers.git_service.create_commit', new_callable=AsyncMock) as mock_create_commit:

        mock_github_instance = MockGithub.return_value
        mock_github_instance.get_all_pull_requests.return_value = {
            "pull_requests": [{"node_id": "pr123"}]
        }
        mock_github_instance.get_detailed_commit_info.return_value = {
            "sha": "abc123",
            "tree_sha": "tree123",
            "comment_count": 0,
            "author": "Test Author",
            "commit_time": "2023-05-01T12:00:00Z",
            "files_changed": [],
            "message": "Test commit"
        }

        mock_get_pr.return_value = {"_id": "pr123", "pr_origin": "GITHUB", "git_node_id": "pr123"}
        mock_create_commit.return_value = {"_id": "commit123"}

        result = await handle_push_event(mock_payload, "installation123", mock_user, mock_repo)

        assert result == {"success": True, "msg": "Processed push event successfully"}
        MockGithub.assert_called_once_with(github_installation_id="installation123", github_is_read_access=True)
        mock_github_instance.get_all_pull_requests.assert_called_once_with(
            "test/repo",
            params={"head": "test:main", "state": "open"}
        )
        mock_github_instance.get_detailed_commit_info.assert_called_once_with("test/repo", "abc123")
        mock_get_pr.assert_called_once_with({"pr_origin": "GITHUB", "git_node_id": "pr123"})
        mock_create_commit.assert_called_once()


@pytest.mark.asyncio
async def test_handle_push_event_no_commit_data(mock_payload, mock_user, mock_repo):
    mock_payload["head_commit"] = None
    result = await handle_push_event(mock_payload, "installation123", mock_user, mock_repo)
    assert result == {"success": True, "msg": "No commit data found in the payload"}


@pytest.mark.asyncio
async def test_handle_push_event_non_branch_push(mock_payload, mock_user, mock_repo):
    mock_payload["ref"] = "refs/tags/v1.0"
    result = await handle_push_event(mock_payload, "installation123", mock_user, mock_repo)
    assert result == {"success": True, "msg": "Ignored push to non-branch ref: refs/tags/v1.0"}


@pytest.mark.asyncio
async def test_handle_push_event_error(mock_payload, mock_user, mock_repo):
    with patch('app.routers.git_service.Github', new_callable=AsyncMock) as MockGithub:
        mock_github_instance = MockGithub.return_value
        mock_github_instance.get_all_pull_requests.side_effect = Exception("API Error")

        result = await handle_push_event(mock_payload, "installation123", mock_user, mock_repo)

        assert result == {"success": False, "msg": "Error processing push event: API Error"}

# Add more test cases as needed to cover different scenarios and edge cases
```


These unit tests cover the main scenarios for the `handle_push_event` function, including successful execution, handling missing commit data, non-branch pushes, and error cases. You can add more test cases to cover additional scenarios and edge cases as needed.

To run these tests, you'll need to have pytest and pytest-asyncio installed. You can run the tests using the command:

pytest tests/test_git_service.py



Make sure to adjust the import paths if necessary, depending on your project structure.
dev-archie-ai-code-explain-pr[bot] commented 1 month ago

๐Ÿ” Processing Issue

Steps:

All steps have been completed.

dev-archie-ai-code-explain-pr[bot] commented 1 month ago

✅ Completed Processing Issue

Steps:

All steps have been completed.

dev-archie-ai-code-explain-pr[bot] commented 1 month ago

Processing Complete (PR Creation Failed)

Steps:

Processing is complete, but PR Creation failed.

dev-archie-ai-code-explain-pr[bot] commented 1 month ago

✅ Completed Processing Issue

Steps:

All steps have been completed successfully.

ajitesh123 commented 1 month ago

/help

ajitesh123 commented 4 days ago

test

ajitesh123 commented 4 days ago

autocoder
