ppy / osu-web

the browser-facing portion of osu!
https://osu.ppy.sh
GNU Affero General Public License v3.0

osu forums get request issue #8227

Closed SeaSaltSmiles closed 3 years ago

SeaSaltSmiles commented 3 years ago

I've been working on my osu! Discord bot and wanted to add a command that returns the titles of the most recent posts in the tournaments forum. Looking at the API documentation, it doesn't seem possible at the moment. [You can see the whole code here]

The URL /forums/topics/55 leads me to this post and not here. With just forums/55 (as it shows in the tournaments link), the code prints the error {'error': 'Invalid url or incorrect request method.'}

Is there a way to get the data from the tournaments forum? Hopefully this makes sense.

import json
import requests

def tournament_updates():
  # get_token() and API_URL are defined elsewhere in the bot
  token = get_token()

  headers = {
    'Content-Type': 'application/json',
    'Accept': 'application/json',
    'Authorization': f'Bearer {token}'
  }

  params = {}

  # /forums/55 isn't a documented endpoint, so this returns
  # {'error': 'Invalid url or incorrect request method.'}
  response = requests.get(f'{API_URL}/forums/55', params=params, headers=headers)

  json_data = json.loads(response.text)
  print(json_data)

tournament_updates()
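
For comparison, fetching a single known topic does appear possible through API v2's documented "Get Topic and Posts" endpoint (GET /forums/topics/{topic}). A minimal sketch along the same lines, reusing the get_token() and API_URL helpers above, with the topic ID chosen only for illustration:

def topic_posts(topic_id):
  # Sketch only: returns one topic and its posts, which is why
  # /forums/topics/<id> resolves to a single thread rather than a forum listing
  token = get_token()

  headers = {
    'Content-Type': 'application/json',
    'Accept': 'application/json',
    'Authorization': f'Bearer {token}'
  }

  response = requests.get(f'{API_URL}/forums/topics/{topic_id}', headers=headers)
  return response.json()

print(topic_posts(55))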
gagahpangeran commented 3 years ago

#7440 #7486

SeaSaltSmiles commented 3 years ago

Figured out a workaround using a web scraper. Thought I should post it to help anyone else until it's added properly.

import requests
from bs4 import BeautifulSoup

link_start = "https://osu.ppy.sh/community/forums/topics/"
start_unread = "?start=unread"

# Fetch the tournaments forum page
page = requests.get('https://osu.ppy.sh/community/forums/55')
# Parse the HTML content
soup = BeautifulSoup(page.text, 'html.parser')

# Save every topic link (the ones ending in ?start=unread) in a list, skipping duplicates
links_list = []
links = soup.find_all('a')
for link in links:
    full_link = link.get('href')
    if full_link is None or link_start not in full_link:
        continue
    if full_link in links_list:
        continue
    if start_unread in full_link:
        links_list.append(full_link)

def three_tourney():
    # Return three of the collected topic links
    return links_list[-30], links_list[-29], links_list[-28]
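
A short usage sketch (hypothetical, and assuming the forum markup still exposes those links) that prints the page title of each returned topic:

for topic_link in three_tourney():
    topic_page = requests.get(topic_link)
    topic_soup = BeautifulSoup(topic_page.text, 'html.parser')
    # Print the <title> text of each topic page, falling back to the raw link
    print(topic_soup.title.string if topic_soup.title else topic_link)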
peppy commented 3 years ago

Closing as duplicate of linked issues above.