akumar1903 opened 1 year ago
Here's an example that pulls a table into a pandas dataframe. I used response.links['next']['url'] to get the next url.
import requests
import pandas as pd
import os

session = requests.Session()
url = 'https://api.enverus.com/v3/direct-access/tokens'
secret_key = '...'  # value redacted in the post
# The token request itself is cut off in the post; `token` is assumed to hold a valid access token
headers = {}  # not defined as posted, but needed for the line below
headers['Authorization'] = "Bearer {}".format(token)

# First page of the rigs dataset, filtered to the Delaware basin
url = "https://api.enverus.com/v3/direct-access/"
dataset = 'rigs'
query_url = os.path.join(url, dataset)
headers['Authorization'] = "Bearer {}".format(token)
params = dict(deleteddate="null", pagesize=100000, ENVBasin='DELAWARE')
response = session.get(query_url, headers=headers, params=params)
df = pd.DataFrame(response.json())

# Follow the relative 'next' URL from the Link header until an empty page comes back
df_length = 1
while df_length > 0:
    response = session.get(url[:-1] + response.links['next']['url'], headers=headers)
    df_response = pd.DataFrame(response.json())
    df = pd.concat([df, df_response])
    df_length = len(df_response)
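For comparison, here is a minimal sketch of the same Link-header pagination wrapped in a small helper. The fetch_all name and the BASE_URL constant are just illustrative (not part of the Enverus API or SDK), and a valid token is assumed; it keeps requesting pages for as long as the response advertises a 'next' link, so the total row count is not capped by a single pagesize value.

import requests
import pandas as pd

BASE_URL = "https://api.enverus.com/v3/direct-access"  # illustrative constant, no trailing slash

def fetch_all(dataset, token, params):
    """Collect every page of `dataset` into one DataFrame by following the Link header."""
    session = requests.Session()
    headers = {"Authorization": "Bearer {}".format(token)}
    response = session.get("{}/{}".format(BASE_URL, dataset), headers=headers, params=params)
    frames = [pd.DataFrame(response.json())]
    # The 'next' link is a relative path (as in the loop above), so prepend the base URL each hop
    while 'next' in response.links:
        response = session.get(BASE_URL + response.links['next']['url'], headers=headers)
        frames.append(pd.DataFrame(response.json()))
    return pd.concat(frames, ignore_index=True)

df = fetch_all('rigs', token, dict(deleteddate="null", pagesize=100000, ENVBasin='DELAWARE'))

Checking 'next' in response.links, rather than waiting for an empty page, also avoids a KeyError if the last page comes back without a next link.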
I can do response.headers and response.links in normal Python code. What should be done here if I need more than 100,000 records (more pages)?
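For reference, both of those are plain requests attributes and can be inspected directly on any of the responses above, e.g.:

# Assuming `response` is one of the responses from the code above
print(response.headers)            # dict-like of the HTTP response headers
print(response.links)              # Link header parsed into a dict by requests
print('next' in response.links)    # whether the server advertises another page

Whether a 'next' entry is still present after the last page depends on the API, so it is worth checking for it before indexing response.links['next'].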