zepor / NFLSTATS

MIT License

"Daily Change Log – Provides IDs and timestamps for teams, players, game statistics, schedules, and standings that have been modified on a given date. To receive the data updates, use these unique IDs to pull relevant API feeds." 20 Twice a Day 11AM and 11PM to add the new items. New items will cause need for Game Stat, PlayerGameStat API to need to re-trigger based on values returned Ad-Hoc #49

Open zepor opened 9 months ago

codeautopilot[bot] commented 9 months ago

Potential solution

The task involves integrating a change log system that captures modifications to teams, players, game statistics, schedules, and standings, and then using this information to trigger updates in various APIs and display the change log in the frontend dashboard. The solution will require changes across multiple files to ensure that the system correctly logs changes, triggers updates, and displays the information to the users.

Code

For backend-container/src/models/changelog.py:

# No specific code changes are proposed here since the ChangelogEntry model already contains the necessary fields.
# However, if additional details are needed, the model can be extended as described in the proposal.
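For reference, a minimal sketch of what the existing model likely looks like, assuming mongoengine (the `ChangeLog.objects(...)` query syntax used later in this proposal suggests it); the field names follow the ticket discussion and may differ from the actual file:

```python
from datetime import datetime, timezone
from mongoengine import DateTimeField, DictField, Document, StringField

class ChangelogEntry(Document):
    """One modification reported by the Daily Change Log feed."""
    entity_id = StringField(required=True)      # ID of the team, player, game, schedule, or standing that changed
    entity_type = StringField(required=True)    # e.g. 'player', 'team', 'game', 'schedule', 'standing'
    changes = DictField()                       # description of what was modified
    timestamp = DateTimeField(default=lambda: datetime.now(timezone.utc))
```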

For backend-container/src/sportsradar/extract/playerfeeds.py:

# Add the following method to the PlayerFeeds class
def update_player_profiles(self, changelog_model, api_key):
    new_changes = changelog_model.get_new_changes(entity_type='player')
    for change in new_changes:
        player_id = change['entity_id']
        try:
            self.get_player_profile(
                access_level='...',
                version='...',
                language_code='...',
                player_id=player_id,
                file_format='...',
                api_key=api_key
            )
            logger.info(f"Updated player profile for player ID: {player_id}")
        except Exception as e:
            logger.error(f"Failed to update player profile for player ID: {player_id}: {e}")

For backend-container/src/bpscheduler/bpschedule.py:

# Add the following code to schedule the task
from apscheduler.schedulers.background import BackgroundScheduler
from datetime import datetime
from changelog import generate_change_log_entries

def scheduled_task():
    print(f"Generating change log entries at {datetime.now()}")
    generate_change_log_entries()
    print("Change log entries generated successfully.")

scheduler = BackgroundScheduler()
scheduler.add_job(scheduled_task, 'cron', hour='11,23')
scheduler.start()
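A small, hedged addition that commonly accompanies `BackgroundScheduler` (not part of the original proposal): register a shutdown hook so the scheduler stops cleanly when the process exits.

```python
import atexit

# Shut the scheduler down gracefully when the interpreter exits.
atexit.register(lambda: scheduler.shutdown(wait=False))
```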

For backend-container/src/sportsradar/extract/gamefeeds.py:

# Add the following methods to the GameFeeds class
def check_for_updates(self, game_id):
    # Implement the logic to check for updates in the change log
    pass

def update_game_data(self, game_id):
    if self.check_for_updates(game_id):
        boxscore = self.get_game_boxscore(...)
        roster = self.get_game_roster(...)
        statistics = self.get_game_statistics(...)
        pbp = self.get_game_pbp(...)
        # Process and store the fetched data
        pass
    else:
        logger.info(f"No update needed for game ID {game_id}.")

For frontend-container/src/pages/dashboards/Default/index.jsx:

// Import the new ChangeLog component
import ChangeLog from "./ChangeLog";

// Add the ChangeLog component to the dashboard layout
const Default = () => (
  // ... existing code ...
  <Row>
    <Col className="d-flex">
      <ChangeLog /> {/* Add the ChangeLog component to the dashboard */}
    </Col>
  </Row>
  // ... existing code ...
);

For backend-container/src/apimappings/SeasonalStats.py:

from src.models.changelog import ChangeLog

def save_to_database(mapped_seasons, mapped_players, opponenetseasondata, teamseasondata, playerseasondata, team_info_dict):
    for entry_id, mapped_entry in mapped_data.items():
        last_change = ChangeLog.objects(entity_id=entry_id).order_by('-timestamp').first()
        if last_change and last_change.timestamp > existing_entry.last_updated:
            # Update or add records (existing logic)
            pass
        else:
            logging.info(f"No changes found for {collection_name} with id {entry_id}. Skipping update.")

These code snippets provide a high-level overview of the changes needed to implement the solution. Each snippet should be adapted to fit the specific details of the application, such as the actual parameters for API calls, the logic for checking the change log, and the integration with the frontend component. Additionally, thorough testing is required to ensure that the system behaves as expected after these changes are made.

Changes on backend-container/src/models/changelog.py

The `ChangelogEntry` model in the `changelog.py` file already appears to include the necessary fields to satisfy the requirements of the task. It has fields for `entity_id`, `entity_type`, `changes`, and `timestamp`, which align with the ticket description asking for IDs, timestamps, and types of changes. Here is a practical proposal to ensure the `ChangelogEntry` model meets the requirements:

1. Review the `changes` field to ensure it can capture all necessary details about what was modified. For instance, it should be able to store before and after values, or just the new values if that is sufficient for the application's needs.
2. If the `changes` field needs to capture more detailed information, consider extending it to include nested fields or additional metadata. For example, you could have `before` and `after` subfields for each changed property, or include a `change_type` to indicate whether the change was an addition, update, or deletion (see the sketch at the end of this section).
3. Ensure that the `timestamp` field is set correctly at the time of the change. This field should reflect the exact time the change was made.
4. If there are additional requirements for changelog entries that the model does not currently capture, such as the ID of the user who made the change or a reason for the change, add the necessary fields to the model.
5. Update any code that creates `ChangelogEntry` instances to ensure all new fields are populated correctly.
6. Write unit tests to validate that the `ChangelogEntry` model works as expected, especially after making any modifications.
7. Update any documentation related to the `ChangelogEntry` model to reflect the changes made.

Since the current model already includes the necessary fields, there may be no need for immediate changes unless the application's logic requires further details. If the current fields are sufficient, you can proceed with implementing the other parts of the task that interact with the `ChangelogEntry` model.
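A hedged sketch of point 2 above, assuming mongoengine; the field names are illustrative, not taken from the repository:

```python
from mongoengine import DynamicField, EmbeddedDocument, EmbeddedDocumentListField, StringField

class FieldChange(EmbeddedDocument):
    """One modified property, with optional before/after values."""
    field_name = StringField(required=True)
    change_type = StringField(choices=("addition", "update", "deletion"))
    before = DynamicField()   # previous value, if known
    after = DynamicField()    # new value

# On ChangelogEntry, the existing `changes` field could then become:
#     changes = EmbeddedDocumentListField(FieldChange)
```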
Changes on backend-container/src/sportsradar/extract/playerfeeds.py

To modify the `PlayerFeeds` class to trigger updates to the PlayerGameStat API when new change log entries are detected for player-related data, we need to implement a method that checks for changes in the change log and then calls the `get_player_profile` method if necessary. Here's a practical proposal for the changes:

1. Add a new method to the `PlayerFeeds` class that checks for new change log entries related to player data. This method will interact with the change log model to fetch the latest changes.
2. If new changes are detected, the method will iterate over the change log entries and call the `get_player_profile` method for each player ID that has been modified.
3. Ensure that the `get_player_profile` method is capable of handling updates and not just retrievals. This might involve adding logic to update existing records in the database with the new data.
4. Implement error handling and logging so that any issues during the update process are properly reported.

Here is a code snippet that outlines the new method:

```python
class PlayerFeeds:
    # ... existing code ...

    def update_player_profiles(self, changelog_model, api_key):
        """
        Check for new change log entries and update player profiles if necessary.

        :param changelog_model: The model used to interact with the change log.
        :param api_key: The API key for authentication.
        """
        # Fetch new change log entries for players
        new_changes = changelog_model.get_new_changes(entity_type='player')

        # Iterate over new changes and update player profiles
        for change in new_changes:
            player_id = change['entity_id']
            try:
                self.get_player_profile(
                    access_level='...',    # specify the access level
                    version='...',         # specify the version
                    language_code='...',   # specify the language code
                    player_id=player_id,
                    file_format='...',     # specify the file format
                    api_key=api_key
                )
                logger.info(f"Updated player profile for player ID: {player_id}")
            except Exception as e:
                logger.error(f"Failed to update player profile for player ID: {player_id}: {e}")

# Note: Replace '...' with the actual parameters required for the API call.
```

In this snippet, `changelog_model` is a placeholder for the actual change log model used to fetch new changes. The `get_new_changes` method is assumed to be a method of the change log model that returns a list of new change log entries for a given entity type (here, 'player'); a sketch of such a method follows at the end of this section. Remember to replace the placeholder values for `access_level`, `version`, `language_code`, and `file_format` with the actual parameters required for the API call. This code assumes that the change log model and its methods are already implemented and available for use; if they are not, you will need to implement them as part of the overall task.
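A hedged sketch of the assumed `get_new_changes` method, written as a classmethod on the mongoengine change log model; the optional `since` parameter (the timestamp of the last successful sync) is an assumption:

```python
@classmethod
def get_new_changes(cls, entity_type, since=None):
    """Return change log entries of the given entity_type created after `since`, oldest first."""
    query = cls.objects(entity_type=entity_type)
    if since is not None:
        query = query.filter(timestamp__gt=since)
    return [
        {"entity_id": e.entity_id, "changes": e.changes, "timestamp": e.timestamp}
        for e in query.order_by("timestamp")
    ]
```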
Changes on backend-container/src/bpscheduler/bpschedule.py

To create a scheduled task that runs twice daily at 11AM and 11PM to generate and store new change log entries, we need a scheduling library such as `apscheduler`. Here's a practical proposal for the `bpschedule.py` file:

1. Import the necessary modules from `apscheduler`.
2. Define a function responsible for generating and storing the change log entries.
3. Schedule this function to run at 11AM and 11PM every day.

Here's how the code could look after the changes:

```python
import sys
import os
from apscheduler.schedulers.background import BackgroundScheduler
from datetime import datetime

# Other imports remain the same

# Assuming there's a function to generate and store change log entries.
# This function should be implemented in the appropriate module.
from changelog import generate_change_log_entries

def scheduled_task():
    """
    This function is called by the scheduler to generate and store change log entries.
    """
    print(f"Generating change log entries at {datetime.now()}")
    generate_change_log_entries()
    print("Change log entries generated successfully.")

# Initialize the scheduler
scheduler = BackgroundScheduler()

# Schedule the task to run twice daily at 11AM and 11PM
scheduler.add_job(scheduled_task, 'cron', hour='11,23')

# Start the scheduler
scheduler.start()

# Rest of the existing code...
```

In this code snippet, we've added the `BackgroundScheduler` from `apscheduler.schedulers.background` and scheduled the `scheduled_task` function to run at 11AM and 11PM every day using a cron trigger with the hours set to '11,23'.

Please note that the `generate_change_log_entries` function is a placeholder and should be implemented in the appropriate module (possibly in `changelog.py` or another related module). This function should handle the logic for generating and storing the change log entries based on the data updates.

Make sure to add the necessary error handling and logging to the `scheduled_task` function, so exceptions are handled and there is a record of when the task runs and whether it completes successfully; a hedged example follows at the end of this section.
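A minimal sketch of that error handling and logging, assuming the standard library `logging` module (how logging is configured elsewhere in the backend is not shown here):

```python
import logging
from datetime import datetime

logger = logging.getLogger(__name__)

def scheduled_task():
    """Generate and store change log entries, logging success or failure."""
    started_at = datetime.now()
    logger.info("Generating change log entries at %s", started_at)
    try:
        generate_change_log_entries()
    except Exception:
        # Record the full traceback, but let the scheduler keep running future jobs.
        logger.exception("Change log generation failed (run started at %s)", started_at)
    else:
        logger.info("Change log entries generated successfully.")
```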
Changes on backend-container/src/sportsradar/extract/gamefeeds.py

To modify the `GameFeeds` class so that it triggers updates to the Game Stat API when new change log entries are detected for game-related data, we need a mechanism that checks for changes and then calls the appropriate update functions. Here's a practical proposal:

1. **Integrate change log checking**: Add a method to the `GameFeeds` class that checks for new entries in the change log that pertain to game-related data. This method should be called before fetching any game data to determine whether an update is necessary.
2. **Trigger updates**: If the change log indicates that game-related data has been updated, the `GameFeeds` class should call the appropriate methods to fetch the latest data from the SportsRadar API.
3. **Cache or store data**: To avoid unnecessary API calls, implement caching or a storage mechanism that keeps track of the last update time for each game. That way, data is only fetched when there is a change log entry newer than the last update time (a sketch of this follows at the end of this section).

Here is a conceptual implementation outline for the `GameFeeds` class:

```python
from datetime import datetime

class GameFeeds:
    # Existing methods...

    def check_for_updates(self, game_id):
        """
        Check the change log for updates related to the given game_id.

        :param game_id: The ID of the game to check for updates.
        :return: Boolean indicating whether an update is needed.
        """
        # This is a placeholder for the actual change log checking logic.
        # Implement the logic to interact with the change log system:
        # for example, query a database or an API for the latest change log entries,
        # then compare timestamps to determine if there is a newer update since the last check.
        # Return True if an update is needed, False otherwise.
        pass

    def update_game_data(self, game_id):
        """
        Update the game data by fetching the latest information from the API.

        :param game_id: The ID of the game to update.
        """
        # Check if updates are needed for the game.
        if self.check_for_updates(game_id):
            # Fetch the latest data from the API.
            boxscore = self.get_game_boxscore(...)  # Add the required parameters
            roster = self.get_game_roster(...)
            statistics = self.get_game_statistics(...)
            pbp = self.get_game_pbp(...)
            # Process and store the fetched data as needed.
            # This could involve updating a database or sending data to another service;
            # implement the specific logic based on your application's requirements.
            pass
        else:
            # Log that no update is needed.
            logger.info(f"No update needed for game ID {game_id}.")

    # Existing methods would remain unchanged...
```

In this outline, the `check_for_updates` method is a placeholder where you would implement the actual logic to interact with the change log system. The `update_game_data` method uses this check to decide whether to fetch and process new data.

Please note that this is a high-level proposal; the actual implementation requires more detail about how the change log system works, how updates should be processed, and how the data should be stored or cached.
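A hedged sketch of the caching/storage idea in point 3, assuming a small mongoengine document that persists the last successful fetch time per game; the `GameSyncState` model and helper functions are assumptions, not existing code:

```python
from datetime import datetime, timezone
from mongoengine import DateTimeField, Document, StringField

class GameSyncState(Document):
    """Tracks when each game's data was last fetched from the SportsRadar API."""
    game_id = StringField(required=True, unique=True)
    last_synced = DateTimeField()

def mark_game_synced(game_id):
    """Upsert the sync record after a successful fetch."""
    GameSyncState.objects(game_id=game_id).update_one(
        set__last_synced=datetime.now(timezone.utc), upsert=True
    )

def last_synced_at(game_id):
    """Return the last sync time for a game, or None if it has never been fetched."""
    state = GameSyncState.objects(game_id=game_id).first()
    return state.last_synced if state else None
```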
Changes on frontend-container/src/pages/dashboards/Default/index.jsx

To solve the task for the `frontend-container/src/pages/dashboards/Default/index.jsx` file, you will need to create a new component that fetches and displays the daily change log, and then integrate this component into the existing dashboard layout. Here's a step-by-step plan:

1. **Create the ChangeLog component:**
   - Create a new file for the ChangeLog component, for example, `ChangeLog.jsx`.
   - In this component, implement the logic to fetch the change log data from the backend. This could be done with an API call to the backend service responsible for providing the change log data (a hypothetical backend endpoint sketch follows at the end of this section).
   - Display the fetched data in a user-friendly format, such as a table or list. Each entry should include the entity ID, timestamp, and type of change.
2. **Integrate the ChangeLog component into the dashboard:**
   - Import the ChangeLog component into the `Default` dashboard component.
   - Decide where on the dashboard the change log should be displayed. For example, you might add it as a new row or as a new column within an existing row.
   - Add the ChangeLog component to the dashboard's JSX layout. Ensure that it is responsive and fits well with the existing design.
3. **Update the dashboard layout:**
   - Adjust the layout of the dashboard to accommodate the new ChangeLog component. This might involve modifying the `Row` and `Col` components from React Bootstrap to create a new section for the change log.
   - Ensure that the new component does not disrupt the existing layout and is accessible to users.
4. **Test the integration:**
   - Test the dashboard to ensure that the ChangeLog component is fetching and displaying data correctly.
   - Check the responsiveness of the dashboard with the new component on different screen sizes.
   - Verify that the change log updates at the specified times (11AM and 11PM) and that the data is current.

Here's an example of how you might modify the `Default` component to include the ChangeLog component:

```jsx
import React from "react";
import { Helmet } from "react-helmet-async";
import { Container, Row, Col } from "react-bootstrap";

import Header from "./Header";
import Appointments from "./Appointments";
import BarChart from "./BarChart";
import Calendar from "./Calendar";
import Feed from "./Feed";
import PieChart from "./PieChart";
import Projects from "./Projects";
import Statistics from "./Statistics";
import ChangeLog from "./ChangeLog"; // Import the new ChangeLog component

const Default = () => (
  // ... existing dashboard layout ...
  <Row>
    <Col className="d-flex">
      <ChangeLog /> {/* Add the ChangeLog component to the dashboard */}
    </Col>
  </Row>
  // ... existing code ...
);

export default Default;
```

Remember to style the ChangeLog component to match the dashboard's design and ensure that it is user-friendly and accessible.
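The ChangeLog component needs a backend endpoint to fetch from. The repository's routing layer is not shown in this ticket, so the following is only a hypothetical sketch, assuming Flask and the mongoengine change log model; the `/api/changelog` route name and import path are assumptions:

```python
from flask import Blueprint, jsonify

from src.models.changelog import ChangelogEntry  # assumed import path

changelog_api = Blueprint("changelog_api", __name__)

@changelog_api.route("/api/changelog", methods=["GET"])
def recent_changelog():
    """Return the most recent change log entries for the dashboard widget."""
    entries = ChangelogEntry.objects.order_by("-timestamp").limit(50)
    return jsonify([
        {
            "entity_id": e.entity_id,
            "entity_type": e.entity_type,
            "changes": e.changes,
            "timestamp": e.timestamp.isoformat(),
        }
        for e in entries
    ])
```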
Changes on backend-container/src/apimappings/SeasonalStats.py

The `SeasonalStats.py` file is responsible for fetching seasonal statistics from an external API and saving them in a database. The task at hand is to ensure that the API mappings handle the new change log entries and trigger updates where necessary. To address this, we need to modify the `save_to_database` function to check for change log entries before updating or adding new records. Here's a practical proposal:

1. **Integrate the change log model**: Import the `ChangeLog` model and ensure it is properly integrated into the `save_to_database` function. This model should contain fields for entity IDs, timestamps, and types of changes.
2. **Check for changes**: Before updating or adding a new record, check the `ChangeLog` for any relevant entries indicating that the data has changed. This can be done by querying the `ChangeLog` with the entity ID and the timestamp of the last update.
3. **Conditional update**: If the `ChangeLog` indicates that there have been changes since the last update, proceed with the update or addition of the new record. Otherwise, skip the update for that particular record.
4. **Log changes**: Enhance the logging within the `save_to_database` function to record whether a record was updated based on a change log entry or skipped because nothing changed.
5. **Optimize performance**: Since the change log could grow large, consider indexing the `ChangeLog` collection by entity ID and timestamp to speed up lookups (a sketch follows at the end of this section).
6. **Error handling**: Ensure there is robust error handling around the change log integration, so that an issue with the change log does not prevent the entire update process from proceeding.
7. **Testing**: After implementing the changes, thoroughly test the updated `save_to_database` function to ensure that it correctly handles change log entries and updates the database as expected.

Here is a pseudo-code snippet to illustrate how the change log check might be integrated into the `save_to_database` function:

```python
from src.models.changelog import ChangeLog

def save_to_database(mapped_seasons, mapped_players, opponenetseasondata, teamseasondata, playerseasondata, team_info_dict):
    # ... existing code ...

    for entry_id, mapped_entry in mapped_data.items():
        # ... existing code ...

        # Check for changes in the ChangeLog
        last_change = ChangeLog.objects(entity_id=entry_id).order_by('-timestamp').first()
        if last_change and last_change.timestamp > existing_entry.last_updated:
            # Proceed with update or addition
            # ... existing code for updating or adding records ...
            pass
        else:
            # Skip update as there are no new changes
            logging.info(f"No changes found for {collection_name} with id {entry_id}. Skipping update.")

    # ... existing code ...
```

By following these steps, the `SeasonalStats.py` file will handle new change log entries and trigger database updates only when necessary, keeping the system efficient and up to date.
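A hedged sketch of the indexing suggestion in point 5, assuming mongoengine; the compound index is illustrative:

```python
from mongoengine import DateTimeField, DictField, Document, StringField

class ChangeLog(Document):
    entity_id = StringField(required=True)
    entity_type = StringField(required=True)
    changes = DictField()
    timestamp = DateTimeField()

    meta = {
        # Compound index so "latest change for this entity" lookups stay fast
        # as the collection grows.
        "indexes": [("entity_id", "-timestamp")],
    }
```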