Compared with the BART model, which usually took about 5 minutes to produce a summarization, the GPT model is far more efficient, needing only about 20 seconds. We therefore decided to use the GPT model to summarize the news.
To prevent the function from running and consuming the API key unless we explicitly want to test it, I added a flag that controls whether the function is executed:
line 113: test_function = False
Input:
5 articles in a list from Crawler
example: ["article 1", "article 2", "article 3", "article 4", "article 5"]
Output:
5 summaries in a list to Frontend
example: ["summary 1", "summary 2", "summary 3", "summary 4", "summary 5"]
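The guard flag and the input/output contract above can be sketched as follows. This is a minimal illustration, not the project's actual code: the function name `summarize_articles`, the model name, and the `client=None` offline path are assumptions added for the example. The real OpenAI call is only reached when a client is passed in and `test_function` is flipped to `True`, so importing the module never spends API credits.

```python
test_function = False  # line 113: flip to True only when you explicitly want to hit the API

def summarize_articles(articles, client=None, model="gpt-3.5-turbo"):
    """Take a list of article strings, return a list of summary strings (one per article).

    `client` is an OpenAI client instance; when it is None (e.g. during offline
    testing) a placeholder summary is returned instead of calling the API.
    Hypothetical sketch -- names and model are assumptions, not the project's code.
    """
    summaries = []
    for article in articles:
        if client is None:
            # Offline path: no API key needed, just echo a stub summary.
            summaries.append("summary of: " + article[:40])
        else:
            # Live path: one chat-completion request per article.
            resp = client.chat.completions.create(
                model=model,
                messages=[{"role": "user",
                           "content": "Summarize this news article:\n" + article}],
            )
            summaries.append(resp.choices[0].message.content)
    return summaries

if test_function:
    # Only runs when we explicitly enable testing, so the API key is not used otherwise.
    from openai import OpenAI
    articles = ["article 1", "article 2", "article 3", "article 4", "article 5"]
    print(summarize_articles(articles, client=OpenAI()))
```

With `test_function = False`, the module can be imported by the Frontend without any API traffic; the Crawler's five articles go in as one list and five summaries come back in the same order.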