fullstorydev / hauser

Service for moving your Fullstory export files to a data warehouse

Datetime error running hauser locally #117

Closed · brittanyjoiner15 closed 2 years ago

brittanyjoiner15 commented 2 years ago

Trying to run hauser locally to test it out before setting up with BigQuery, and I'm getting this error when I run it:

2022/05/24 12:11:01 parsing time "" as "2006-01-02T15:04:05Z07:00": cannot parse "" as "2006"
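For reference, that error string exactly matches what Go's time.Parse returns when it's handed an empty string with the RFC3339 layout, so I suspect an empty value is being parsed as a timestamp somewhere. A minimal sketch (no hauser code involved, just the standard library) that reproduces the same message:

package main

import (
    "fmt"
    "time"
)

func main() {
    // Parsing an empty string against the RFC3339 layout fails with:
    // parsing time "" as "2006-01-02T15:04:05Z07:00": cannot parse "" as "2006"
    _, err := time.Parse(time.RFC3339, "")
    fmt.Println(err)
}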

I'm running ./hauser -c myconfig.toml from the hauser directory, which contains myconfig.toml and the hauser binary.

Running on 1.3, but I also tried 1.2 and hit the same error. I've tried a handful of different dates and formats in the StartTime field, but they either produce formatting errors or the same error message as above. I've even copied and pasted the exact date from your docs, and it's not working. I've also tried changing UseStartTime from true to false.

Any idea what's causing this?

Here's what myconfig.toml looks like, in case that helps with diagnosing.

FsApiToken = "<my token>"
Backoff = "30s"
BackoffStepsMax = 8

# TmpDir is a directory where the exported files are downloaded to before they are uploaded
# to the warehouse destination. The hauser process will remove these files when it has finished
# processing them.
TmpDir = "tmp"

# ExportDuration determines the time range for each export bundle. The max value is 24 hours.
# If downloads are not completing due to timeouts, lower this value until the downloads
# are able to complete.
# Valid time units are "s", "m", "h" (seconds, minutes, hours).
ExportDuration = "6h"

# ExportDelay determines how long to wait before creating an export.
# This delay is necessary because there is some latency between when an event is recorded
# and when it is available and complete. 24 hours is the default, but is fairly conservative.
# In many cases this can be reduced safely to 3 hours, but note that "swan song" events
# may not make it into FullStory within those 3 hours.
# Valid time units are "s", "m", "h" (seconds, minutes, hours).
ExportDelay = "24h"

# StartTime determines how far back to start exporting data if starting fresh.
# This should be a timestamp in the following format: 2018-12-27T18:30:00Z.
# If StartTime is empty, this will default to 30 days in the past.
StartTime = ""

# Valid provider values:
#  * local: Used for downloading files to the local machine.
#  * gcp: Google Cloud Provider (GCS and BigQuery)
#  * aws: Amazon Web Services (S3 and Redshift)
Provider = "local"
# If true, data will only be uploaded to the corresponding Provider's storage mechanism.
StorageOnly = false
SaveAsJson = false

[local]
SaveDir = "./Data"
StartTime = ""
UseStartTime = true

brittanyjoiner15 commented 2 years ago

Update: it appears it doesn't like the StartTime = "" value. It seems to work if I change that to an actual date.
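For anyone else who hits this: filling in an actual RFC3339 timestamp instead of the empty string seems to be the workaround when UseStartTime = true. For example (the date here is just a placeholder, pick whatever start point you want):

[local]
SaveDir = "./Data"
StartTime = "2022-04-01T00:00:00Z"
UseStartTime = true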