creightontaylor / simple_website


Generate Daily GitHub Summary Report #25

Open creightontaylor opened 4 months ago

creightontaylor commented 4 months ago

Description:

Create a simple automated task that demonstrates Sif Task Force's ability to handle repetitive development tasks. The task involves generating a daily summary report from a GitHub repository, including the number of commits, open issues, and closed issues. The report should be formatted in markdown and saved in the repository.

Background/Context:

This demo aims to showcase the basic automation capabilities of Sif Task Force, allowing developers to see how the tool can offload routine tasks. The focus should be on simplicity and clear demonstration of the automation process.

Task Details:

- **Task Name:** Generate Daily GitHub Summary Report
- **Frequency:** Daily
- **Repository:** https://github.com/creightontaylor/simple_website
- **Content of the Report:**
  - Date of the report
  - Number of commits made on that day
  - List of commits with commit messages and author names
  - Number of open issues
  - Number of closed issues
- **Format:** Markdown
- **Destination:** Save the report as daily_summary.md in the root of the repository.

Steps to Implement:

1. **Setup Task:** Configure the task to run daily at a specified time. Ensure the task can authenticate and access the provided GitHub repository.
2. **Data Collection:** Fetch the number of commits made on the day. Retrieve commit messages and author names. Count the number of open issues. Count the number of closed issues.
3. **Report Generation:** Format the collected data into a markdown report. Include the date, commit count, commit details, and issue counts.
4. **Saving the Report:** Save the generated markdown report as daily_summary.md in the root directory of the repository.

Acceptance Criteria:

- **Report Accuracy:** The report should accurately reflect the number of commits, open issues, and closed issues for the day.
- **Report Formatting:** The report should be well-formatted in markdown.
- **Automation:** The task should run automatically at the specified time each day without manual intervention.
- **Saving the Report:** The report should be correctly saved in the specified location within the repository.

User Stories:

- As a developer, I want to see how Sif Task Force can automate daily reporting tasks so that I can save time on routine activities.
- As a project manager, I want to receive a daily summary report of the repository to stay updated on the project's progress without manually checking GitHub.

sif-taskforce commented 4 months ago

1. πŸ“ Indexing

I'm indexing the files within your repository.

Successfully indexed!

2. πŸ”Ž Searching

I'm searching for relevant snippets in your repository.

It seems there was a misunderstanding. I can't directly execute or set up automation tasks, access external URLs, or interact with GitHub repositories beyond providing guidance based on the information you've shared. However, I can guide you through the steps and the kind of script you might need to write to accomplish this task.

  1. Authentication with GitHub API: You'll need a GitHub token to authenticate requests. Store this securely and use it to authenticate your API requests.

  2. Fetching Commits:

    • Use the GitHub API to fetch commits for the specified repository. The endpoint might look something like GET /repos/:owner/:repo/commits.
    • Filter these by the current date to get the day's commits.
  3. Counting Issues:

    • Use the GitHub API to fetch issues. You can find open issues with GET /repos/:owner/:repo/issues?state=open and closed issues with GET /repos/:owner/:repo/issues?state=closed. Note that this endpoint also returns pull requests (distinguishable by a pull_request key in each item) and paginates its results, so filter and paginate accordingly.
    • Filter these by date to count the number of issues opened or closed on that day.
  4. Formatting the Report:

    • Use Markdown syntax to format your report. Include sections for the date, commit count and details, and the count of open and closed issues.
  5. Saving the Report:

    • To save the report as daily_summary.md in the root directory, you might need to use the GitHub API to create or update a file in the repository. The endpoint could be something like PUT /repos/:owner/:repo/contents/daily_summary.md.
  6. Automation:

    • This script can be automated using a cron job or a GitHub Action to run at a specified time daily.
    • If using GitHub Actions, you can set up a workflow with a schedule event in the .github/workflows directory.
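The contents-API call described in step 5 can be sketched as a small helper that builds the request body. The API expects the file content base64-encoded, plus the existing file's blob sha when updating; the function and commit-message text below are illustrative, not part of the original plan.

```python
import base64

def build_contents_payload(report_text, sha=None):
    """Build the JSON body for PUT /repos/:owner/:repo/contents/daily_summary.md.

    The contents API requires the file content base64-encoded. When updating
    an existing file you must also pass its current blob sha; omit it when
    creating the file for the first time.
    """
    payload = {
        "message": "Add daily GitHub summary report",
        "content": base64.b64encode(report_text.encode("utf-8")).decode("ascii"),
    }
    if sha is not None:
        payload["sha"] = sha
    return payload
```

You would send this payload with `requests.put(url, json=payload, headers=headers)`, after first doing a GET on the same path to recover the sha if daily_summary.md already exists.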

Here's a very basic outline of what the script might look like in Python:

import os
import requests
from datetime import datetime

# Authentication (avoid hardcoding the token; read it from an environment variable)
token = os.environ.get("GITHUB_TOKEN", "YOUR_GITHUB_TOKEN")
headers = {"Authorization": f"token {token}"}

# Fetch commits ('since' expects an ISO 8601 timestamp)
today = datetime.now().strftime('%Y-%m-%d')
commits_response = requests.get(
    "https://api.github.com/repos/creightontaylor/simple_website/commits",
    params={"since": today + "T00:00:00Z"},
    headers=headers,
)
commits_data = commits_response.json()

# Count and detail commits
commit_count = len(commits_data)
commit_details = [(commit["commit"]["author"]["name"], commit["commit"]["message"]) for commit in commits_data]

# Fetch and count issues (the issues endpoint also returns pull requests,
# and results are paginated at 30 items per page by default)
open_issues_response = requests.get(
    "https://api.github.com/repos/creightontaylor/simple_website/issues",
    params={"state": "open", "per_page": 100},
    headers=headers,
)
closed_issues_response = requests.get(
    "https://api.github.com/repos/creightontaylor/simple_website/issues",
    params={"state": "closed", "per_page": 100},
    headers=headers,
)
open_issues_count = len([i for i in open_issues_response.json() if "pull_request" not in i])
closed_issues_count = len([i for i in closed_issues_response.json() if "pull_request" not in i])

# Format report
report_content = f"""
# Daily Summary Report for {today}

## Commits
Total: {commit_count}
Details:
"""
for author, message in commit_details:
    report_content += f"- {message} by {author}\n"

report_content += f"""
## Issues
- Open: {open_issues_count}
- Closed: {closed_issues_count}
"""

# Save report
# You'll need to use the GitHub API to create or update the daily_summary.md file with `report_content`

This script is a starting point. You'll need to refine it, especially the parts that involve date filtering for issues and the final step of saving the report to the repository. Remember to handle errors and edge cases, such as days with no commits or issues.
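The date filtering mentioned above can be isolated in a small, testable helper. This is a sketch under the assumption that issues are passed in as parsed API JSON; the function name and the choice of comparing the first ten characters of the ISO 8601 timestamp are illustrative.

```python
from datetime import date

def count_issues_on(issues, day, field):
    """Count issues whose timestamp field ('created_at' or 'closed_at')
    falls on the given day.

    Timestamps from the GitHub API are ISO 8601 strings like
    '2024-05-01T12:00:00Z', so comparing the first ten characters
    against day.isoformat() matches on the calendar date.
    """
    count = 0
    for issue in issues:
        # The issues endpoint also returns pull requests; skip them.
        if "pull_request" in issue:
            continue
        ts = issue.get(field)
        if ts and ts[:10] == day.isoformat():
            count += 1
    return count
```

For example, `count_issues_on(closed_issues, date.today(), "closed_at")` would count only the issues closed today rather than all closed issues.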


3. 🌐 Planning

I'm creating a plan for the discussion and coding steps.

πŸ’‘ Steps

Step 1 - Create virtual environment

5. 🎲 Assignment

Here is the finalized workflow:

Create virtual environment **Action:** `create` **File:** `create_virtual_environment.sh` **Lines:** `[1, 3]` **Dependencies:** `[]` **Instructions:** `Write a shell script to create a Python virtual environment. First line should navigate to the root directory. Second line should execute 'python3 -m venv venv' to create the virtual environment named 'venv'.` **Agent:** `Architect`
Activate virtual environment **Action:** `modify` **File:** `activate_virtual_environment.sh` **Lines:** `[1, 1]` **Dependencies:** `['Step 1']` **Instructions:** `Write a shell script to activate the virtual environment. The script should contain the command 'source venv/bin/activate' to activate the virtual environment.` **Agent:** `Architect`
Install dependencies **Action:** `create` **File:** `requirements.txt` **Lines:** `[1, 5]` **Dependencies:** `['Step 2']` **Instructions:** `Create a 'requirements.txt' file listing all necessary Python packages such as 'requests' for making API calls and 'python-dotenv' for managing environment variables. Each package should be on a new line.` **Agent:** `Architect`
Setup GitHub secrets **Action:** `create` **File:** `setup_github_secrets.md` **Lines:** `[1, 10]` **Dependencies:** `[]` **Instructions:** `Create a markdown file with instructions on how to set up GitHub secrets for storing the GitHub token securely. Include steps to navigate to the repository settings, secrets section, and how to add a new secret named 'GITHUB_TOKEN'.` **Agent:** `Architect`
Create script for generating report **Action:** `create` **File:** `generate_report.py` **Lines:** `[1, 50]` **Dependencies:** `['Step 3', 'Step 4']` **Instructions:** `Write a Python script named 'generate_report.py' that uses the GitHub API to fetch the required data. Include error handling with try-except blocks, check the rate limit before making API calls, and use environment variables to access the GitHub token. The script should generate a markdown file named 'daily_summary.md' with the date, commit details, and issue counts.` **Agent:** `Architect`
Schedule script execution **Action:** `create` **File:** `schedule_script.sh` **Lines:** `[1, 3]` **Dependencies:** `['Step 5']` **Instructions:** `Create a shell script named 'schedule_script.sh' to schedule the daily execution of 'generate_report.py' using cron. Include a cron job setup that runs the script at a specified time each day.` **Agent:** `Architect`
Update README **Action:** `modify` **File:** `README.md` **Lines:** `[1, 20]` **Dependencies:** `['Step 1', 'Step 2', 'Step 3', 'Step 4', 'Step 5', 'Step 6']` **Instructions:** `Update the 'README.md' file to include documentation for the new automated daily summary report feature. Describe the purpose of the feature, how to set up and run the scripts, including setting up the virtual environment, installing dependencies, setting up GitHub secrets, and scheduling the script execution.` **Agent:** `Architect`
sif-taskforce commented 4 months ago

To enhance security, error handling, and efficiency in the proposed workflow, I recommend the following improvements:

  1. Security Enhancement: Modify the authentication script to use environment variables or GitHub secrets for storing the GitHub token instead of hardcoding it. This approach minimizes security risks by ensuring the token is not exposed in the script's source code.
  2. Error Handling: Incorporate try-except blocks in all scripts making API requests to gracefully handle potential API errors or network issues. This will improve the robustness of the workflow by preventing unhandled exceptions from causing failures.
  3. Rate Limiting: Implement a check for the GitHub API's rate limit status at the beginning of each script that makes API requests. If the rate limit is close to being exceeded, the script should wait or log a warning instead of proceeding with the request. This prevents the workflow from being unexpectedly halted due to rate limit violations.
  4. Dependency Management: Use a virtual environment for the project to manage dependencies more effectively. This isolates the project's dependencies from the global Python environment, reducing the risk of conflicts and making it easier to replicate the project setup.
  5. Integration and Workflow Compatibility: Ensure that data exchange between scripts is handled through well-defined interfaces or shared data formats. This could involve using JSON files or Python modules for sharing data, which would facilitate easier integration and maintenance.
  6. Documentation and Maintainability: Finally, ensure comprehensive documentation is provided for each script and the overall workflow. This includes comments in the code explaining the purpose and logic of each part, as well as a README file detailing how to set up and run the workflow.

Implementing these suggestions will not only address the current concerns but also enhance the overall quality and maintainability of the workflow.
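The rate-limit check suggested in point 3 could be a small helper like the sketch below. GET /rate_limit is the documented endpoint whose response carries `resources.core.remaining` and `resources.core.reset` (a Unix timestamp); the function name and the reserve threshold are illustrative assumptions.

```python
def seconds_until_safe(rate_limit_json, now, threshold=10):
    """Given the parsed JSON from GET /rate_limit, return how many seconds to
    wait before making more core-API requests (0 means it is safe to proceed).

    `now` is the current Unix time; `threshold` is how many remaining
    requests we insist on keeping in reserve before backing off.
    """
    core = rate_limit_json["resources"]["core"]
    if core["remaining"] > threshold:
        return 0
    # `reset` is the Unix timestamp at which the quota replenishes.
    return max(0, core["reset"] - now)
```

A script would call this as `seconds_until_safe(requests.get("https://api.github.com/rate_limit", headers=headers).json(), int(time.time()))` and either `time.sleep` for the returned duration or log a warning, as the recommendation describes.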