vusaverse / vvtableau

Functions to communicate with Tableau services
https://vusaverse.github.io/vvtableau/

Adding filter argument to download_tableau_data function #8

Open chasebinns27 opened 3 months ago

chasebinns27 commented 3 months ago

It would be very helpful if a filter argument were added to the download_tableau_data function, to give better control over the view data being pulled. Ideally we could pass something like filter = 'Year=2023' to the function to return only 2023 view data, and 'vf_Year=2023' would then be included in the base_url.
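For reference, a minimal sketch of the request URL the function would then need to build (server, site, and view IDs below are placeholders), with the view filter passed as a vf_-prefixed query parameter:

# Hypothetical example: the filter 'Year=2023' becomes the query parameter "vf_Year=2023"
paste0(
  "https://your-tableau-server.com/api/3.8/sites/<site_id>/views/<view_id>/data",
  "?vf_Year=2023"
)
#> [1] "https://your-tableau-server.com/api/3.8/sites/<site_id>/views/<view_id>/data?vf_Year=2023"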

Tomeriko96 commented 3 months ago

Hi @chasebinns27

That would indeed be useful!

Could you check whether the following function fits your needs and get back to me?

#' Download Tableau view data
#'
#' Downloads the data from a Tableau view as a CSV file.
#'
#' @param tableau A list containing the Tableau authentication variables: `base_url`, `token`, `user_id`, and `site_id`.
#' @param view_id The ID of the view to download.
#' @param path_to_save The directory to write the CSV data file to.
#' @param api_version The API version to use (default: 3.8).
#' @param filters A named list of filters to apply to the view data. The names should be the field names, and the values should be the filter values.
#'
#' @return The httr response object from the request; the downloaded data is written to disk as a side effect.
#'
#' @export
#'
#' @family Tableau REST API
download_tableau_data <- function(tableau, view_id, path_to_save, api_version = 3.8, filters = NULL) {
  base_url <- tableau$base_url
  token <- tableau$token
  site_id <- tableau$site_id

  # Build the full endpoint URL for the Query View Data method
  base_url <- paste0(
    base_url, "api/", api_version, "/sites/",
    site_id, "/views/", view_id, "/data"
  )

  # Construct the URL with filters
  url <- base_url
  if (!is.null(filters)) {
    filter_params <- paste0("vf_", names(filters), "=", filters, collapse = "&")
    url <- paste0(url, "?", filter_params)
  }

  # Download the data as CSV
  httr::GET(
    url, httr::add_headers(`X-Tableau-Auth` = token),
    httr::write_disk(paste0(path_to_save, "data.csv"), overwrite = TRUE)
  )
}

I would then use it as follows:


filters <- list(Year = 2023)
download_tableau_data(tableau, view_id, path_to_save, filters = filters)
chasebinns27 commented 3 months ago

Thanks @Tomeriko96 ! This looks great. My only recommendation would be to add something similar to the following two lines of code within the if (!is.null(filters)) logic, to ensure that filter names and values containing spaces, slashes, and other special characters are properly encoded:

names(filters) <- sapply(names(filters), utils::URLencode, reserved = TRUE)
filters <- sapply(filters, utils::URLencode, reserved = TRUE)
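For example (purely illustrative values), utils::URLencode() with reserved = TRUE percent-encodes spaces, slashes, and other reserved characters:

utils::URLencode("North America", reserved = TRUE)
#> [1] "North%20America"
utils::URLencode("Food/Beverage", reserved = TRUE)
#> [1] "Food%2FBeverage"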

Tomeriko96 commented 3 months ago

@chasebinns27 Excellent addition as well!

I gather the function should then look as follows:

#' Download Tableau view data
#'
#' Downloads the data from a Tableau view as a CSV file.
#'
#' @param tableau A list containing the Tableau authentication variables: `base_url`, `token`, `user_id`, and `site_id`.
#' @param view_id The ID of the view to download.
#' @param path_to_save The directory to write the CSV data file to.
#' @param api_version The API version to use (default: 3.8).
#' @param filters A named list of filters to apply to the view data. The names should be the field names, and the values should be the filter values.
#' @param max_age The maximum number of minutes view data will be cached before being refreshed (optional).
#'
#' @return The httr response object from the request; the downloaded data is written to disk as a side effect.
#'
#' @details
#' This function uses the Tableau REST API to download the data of a specified view as a CSV file.
#' It constructs the appropriate URL, applies any specified filters, and saves the data to the specified path.
#'
#' For more information on the Tableau REST API, see the official documentation for this method:
#' \url{https://help.tableau.com/current/api/rest_api/en-us/REST/rest_api_ref.htm#query_view_data}
#'
#' @examples
#' \dontrun{
#' tableau <- list(
#'   base_url = "https://your-tableau-server.com/",
#'   token = "your-auth-token",
#'   user_id = "your-user-id",
#'   site_id = "your-site-id"
#' )
#' view_id <- "your-view-id"
#' path_to_save <- "path/to/save/"
#' filters <- list("Region" = "North America", "Category" = "Technology")
#' max_age <- 10
#' download_tableau_data(tableau, view_id, path_to_save, filters = filters, max_age = max_age)
#' }
#'
#' @export
#' @family Tableau REST API
download_tableau_data <- function(tableau, view_id, path_to_save, api_version = 3.8, filters = NULL, max_age = NULL) {
  base_url <- tableau$base_url
  token <- tableau$token
  site_id <- tableau$site_id

  # Build the full endpoint URL for the Query View Data method
  base_url <- paste0(
    base_url, "api/", api_version, "/sites/",
    site_id, "/views/", view_id, "/data"
  )

  # Construct the URL with filters and maxAge
  url <- base_url
  params <- list()

  if (!is.null(filters)) {
    # URL-encode filter names and values (values are coerced to character first)
    names(filters) <- sapply(names(filters), utils::URLencode, reserved = TRUE)
    filters <- sapply(filters, function(x) utils::URLencode(as.character(x), reserved = TRUE))
    filter_params <- paste0("vf_", names(filters), "=", filters, collapse = "&")
    params <- c(params, filter_params)
  }

  if (!is.null(max_age)) {
    params <- c(params, paste0("maxAge=", max_age))
  }

  if (length(params) > 0) {
    url <- paste0(url, "?", paste(params, collapse = "&"))
  }

  # Download the data as CSV
  httr::GET(
    url, httr::add_headers(`X-Tableau-Auth` = token),
    httr::write_disk(paste0(path_to_save, "data.csv"), overwrite = TRUE)
  )
}
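A quick usage sketch (IDs and paths are placeholders, mirroring the roxygen example above); with these inputs the request URL ends in ?vf_Region=North%20America&vf_Category=Technology&maxAge=10 and the result is written to data.csv in the given directory:

filters <- list("Region" = "North America", "Category" = "Technology")
download_tableau_data(tableau, view_id, path_to_save = "path/to/save/",
                      filters = filters, max_age = 10)

# Read the downloaded view data back in for inspection
view_data <- read.csv("path/to/save/data.csv")
head(view_data)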
chasebinns27 commented 3 months ago

@Tomeriko96 These edits look great to me. Thanks!

Tomeriko96 commented 2 months ago

Good! Commit 5d9456d3f92fcbef00756fc63e03e653f3493575 introduced the changes mentioned in this issue.

The changes are available on the development version of the package. In the coming weeks, I will look at the other methods of the package to add (optional) arguments, before submitting a new version to CRAN.

Until the CRAN update, I will keep this issue open.

benjaminrobinson commented 1 week ago

Hello, @Tomeriko96! Do you have an idea of when you plan to ship new updates to CRAN? Just asking for internal planning purposes for my company's use of vvtableau. Thank you!

tin900 commented 1 week ago

Hello @benjaminrobinson,

Unfortunately, we did not find time over the summer to add the (optional) arguments to the other existing functions. However, we can certainly submit the current state of the package to CRAN.

tin900 commented 1 week ago

Hi @benjaminrobinson, package version 0.7.0 has been submitted to CRAN and should be available shortly.