-
## Bug Report
Please answer these questions before submitting your issue. Thanks!
### 1. Minimal reproduce step (Required)
1. Run Lightning import when the file name pattern is not correct and didn't …
-
Original: https://github.com/datahq/datahub-qa/issues/105
```javascript
File = require('data.js').File
// loading an ISO-8859 resource:
> file = File.load('https://raw.githubusercontent.com/frictionle…
```
-
Bots work off an additions file that looks like this:

```
hash_id,postal_code,company,location,notice_date,effective_date,jobs,is_temporary,is_closure,is_amendment
5d791bf6839704caea183ffc6948d0f9ba7…
```
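As a rough illustration, a file with that header can be consumed with the standard `csv` module. The sample row and all field values below are invented for illustration; only the column names match the file above:

```python
import csv
import io

# Hypothetical sample matching the header above (values are invented).
SAMPLE = (
    "hash_id,postal_code,company,location,notice_date,effective_date,"
    "jobs,is_temporary,is_closure,is_amendment\n"
    "abc123,94107,Acme Corp,San Francisco,2020-03-01,2020-04-01,120,false,true,false\n"
)

def read_additions(fp):
    """Yield each addition as a dict keyed by column name."""
    for row in csv.DictReader(fp):
        yield row

rows = list(read_additions(io.StringIO(SAMPLE)))
print(rows[0]["hash_id"], rows[0]["jobs"])  # abc123 120
```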
-
I am trying to build a dynamic table from a file uploaded by a user using odo, but the table is not being created dynamically. Is there any built-in method I can use, or should I resort to manually creating the tabl…
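For what it's worth, odo can usually create the target SQL table itself from the source's discovered schema (e.g. `odo(df, 'sqlite:///db.sqlite::tablename')`). If a manual fallback is needed instead, a minimal standard-library sketch might look like the following; the table name, sample data, and the `create_table_from_csv` helper are all illustrative, and header-derived column names would need sanitizing in real use:

```python
import csv
import io
import sqlite3

def create_table_from_csv(conn, table, fp):
    """Create a table whose columns mirror the CSV header, then load the rows.

    A manual-fallback sketch: every column is typed TEXT, and column names
    come verbatim from the header, so they must be trusted or sanitized.
    """
    reader = csv.reader(fp)
    header = next(reader)
    cols = ", ".join(f'"{c}" TEXT' for c in header)
    conn.execute(f'CREATE TABLE "{table}" ({cols})')
    placeholders = ", ".join("?" for _ in header)
    conn.executemany(f'INSERT INTO "{table}" VALUES ({placeholders})', reader)

conn = sqlite3.connect(":memory:")
create_table_from_csv(conn, "uploads", io.StringIO("name,age\nAda,36\nAlan,41\n"))
print(conn.execute("SELECT name, age FROM uploads").fetchall())
```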
-
Hi Mark,
There is currently functionality to download the metadata records as JSON from the catalogue (for logged-in users). I would like to allow downloading in CSV format as well, but for that I ne…
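To sketch the JSON-to-CSV step, a list of flat metadata records can be converted with `csv.DictWriter`. The record fields below are hypothetical, and nested metadata would need flattening before this approach applies:

```python
import csv
import io
import json

# Hypothetical metadata records as they might come from the JSON download.
records_json = '[{"title": "Rainfall", "format": "CSV"}, {"title": "Census", "format": "JSON"}]'

def records_to_csv(records):
    """Render a list of flat JSON objects as CSV text.

    Columns are the union of all keys, sorted for a stable header;
    this sketch assumes flat (non-nested) records.
    """
    fieldnames = sorted({k for r in records for k in r})
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(records)
    return out.getvalue()

print(records_to_csv(json.loads(records_json)))
```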
-
I create a data package and add CSV files. "Resources" information is created, including a schema for each file that is derived from the data. I can successfully add metadata at the package level, lik…
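For context, a minimal descriptor of the shape described above, with package-level metadata plus a derived per-resource schema, might look like this sketch; the package name, paths, and fields are all illustrative:

```python
import json

# Illustrative minimal Data Package descriptor: one resource,
# with a schema of the kind derived from the CSV data.
descriptor = {
    "name": "example-package",
    "resources": [
        {
            "name": "cities",
            "path": "cities.csv",
            "schema": {
                "fields": [
                    {"name": "city", "type": "string"},
                    {"name": "population", "type": "integer"},
                ]
            },
        }
    ],
}
print(json.dumps(descriptor, indent=2))
```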
-
I used PowerQuery to recreate the time series file for today (3/24/2020, which includes 3/23/2020 as well) for confirmed cases. It took about 1.5 hours (of which 30 minutes was because I ended up real…
-
```
Info: Your redshift cluster have 4 slices and 2842 records in supplied pandas dataframe.
Table already exists into provided schema, Proceeding to next step.
Started uploading pandas dataframe to S3 …
```
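The log above suggests the upload is split into one chunk per cluster slice so Redshift can COPY in parallel. A sketch of that split follows; the `slice_bounds` helper is my own naming for illustration, not the library's API:

```python
def slice_bounds(n_records, n_slices):
    """Split n_records into n_slices near-equal contiguous chunks,
    mirroring the per-slice file split the log above describes."""
    base, extra = divmod(n_records, n_slices)
    bounds, start = [], 0
    for i in range(n_slices):
        end = start + base + (1 if i < extra else 0)
        bounds.append((start, end))
        start = end
    return bounds

# The 2842 records over 4 slices from the log:
print(slice_bounds(2842, 4))  # [(0, 711), (711, 1422), (1422, 2132), (2132, 2842)]
```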
-
## Environment
- **Airbyte version**: 0.42.0
- **OS Version / Instance**: macOS 13.2.1
- **Deployment**: example are Docker or …
-
Hi there,
This looks like a great package, and I was testing it for my own automation. I have created the schema of my dataframe with the infer_schema() function, and it returns the …