-
Env: Hadoop 2.8.5
When using the UploadFile() API,
Hadoop returns "HTTP/1.1 100 Continue" before "HTTP/1.1 201 Created".
ExpectContinue needs to be set to true to get the 201; otherwise EnsureSuccessStatusCode…
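For clarity, the interim 1xx status is not the final response: a client has to keep reading until a status of 200 or above arrives, which is effectively what enabling ExpectContinue lets the HTTP stack do. A minimal sketch of that skip-interim logic (the function name and sample status lines are illustrative, not part of any Hadoop client):

```python
def final_status(status_lines):
    """Return the first final (non-1xx) HTTP status code.

    1xx responses such as "100 Continue" are interim: the server
    still sends a real response (e.g. "201 Created") afterwards.
    """
    for line in status_lines:
        code = int(line.split()[1])
        if code >= 200:
            return code
    raise ValueError("no final response received")

# The exchange described above: 100 Continue first, then 201 Created.
print(final_status(["HTTP/1.1 100 Continue", "HTTP/1.1 201 Created"]))  # 201
```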
-
Telnet to the server and send data:
```
["test.abc",[[1449207484,{"cid":1,"time":1402497769,"name":"\u6155\u5bb9\u5fb7\u5eb7","ctime":1402110157}]],{"chunk":"a14492074850006"}]
```
Error info:
```
2015-1…
-
Currently I'm able to create a file object as below when my .h5 file is local. After creating the object, I do some operations on the file object (h5_object) as below:
```
import h5py
# Get file object for the local .h5 file (path is illustrative)
h5_object = h5py.File("myfile.h5", "r")
```
-
The Export v1 / Import (V2) tasks should support the ability to export environment variables and import them back.
Refer to https://github.com/saagie/gradle-saagie-dataops-plugin/wiki/projectsExportV1
-
## CVE-2020-9492 - High Severity Vulnerability
Vulnerable Library - hadoop-hdfs-2.5.1.jar
Apache Hadoop HDFS
Library home page: http://www.apache.org
Path to dependency file: /foxtrot-core/pom.xml
P…
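One common remediation is forcing a patched artifact version in the POM. A hedged sketch for /foxtrot-core/pom.xml, using Maven's standard dependencyManagement mechanism; the patched version (2.10.1 for the 2.x line, per the CVE-2020-9492 advisory) should be verified against the advisory before use:

```xml
<!-- Sketch: pin hadoop-hdfs to a patched line via dependencyManagement.
     Version chosen per the CVE-2020-9492 advisory (verify before use). -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>2.10.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```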
-
### Is your feature request related to a problem? Please describe.
No. It's a common scenario: we use Flink as a streaming processor for ETL jobs that save data to Hive.
### Describe the solution you'…
-
# Bug Report
`import: doesn't consider url parameter if config.local is available`
## Description
If you have set up two remotes in your `.dvc/config` file and handle their authorization in `…
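For context, a minimal sketch of the two-file setup being described, in DVC's INI-style config format (remote names, URLs, and credential keys here are illustrative):

```ini
; .dvc/config -- two remotes, committed to git
['remote "storage-a"']
    url = s3://bucket-a/path
['remote "storage-b"']
    url = s3://bucket-b/path

; .dvc/config.local -- per-machine authorization, not committed
['remote "storage-a"']
    access_key_id = AKIA...
    secret_access_key = ...
```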
-
# Bug Report
ISO 8859-1 filenames break functionality such as `dvc exp show`
## Description
A file with an ISO-8859-1 character, in my case 'ç', was committed to the git repository. The git …
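The failure mode is easy to reproduce outside of DVC: a byte sequence that is valid ISO-8859-1 is not valid UTF-8, so any code that assumes UTF-8 file names raises on it. A minimal sketch:

```python
# 'ç' as stored by ISO-8859-1 is the single byte 0xE7,
# which is an invalid start byte in UTF-8.
raw = "ç".encode("iso-8859-1")
assert raw == b"\xe7"

try:
    raw.decode("utf-8")
except UnicodeDecodeError as exc:
    print("UTF-8 decode fails:", exc)

# Decoding with the right charset recovers the name.
print(raw.decode("iso-8859-1"))  # ç
```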
-
We have no access rights to install Vector or mount disks for the remote data source,
so we can only pull data from the remote host via SFTP.
Kafka Connect already has an SFTP connector:
https://docs.confluent.i…
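As a sketch only: a Kafka Connect source connector is configured through a JSON properties map like the one below. `name`, `connector.class`, and `tasks.max` are standard Kafka Connect keys; the connector class shown and any SFTP-specific properties are assumptions here and must be taken from the Confluent SFTP connector documentation linked above.

```json
{
  "name": "sftp-pull-sketch",
  "config": {
    "connector.class": "io.confluent.connect.sftp.SftpCsvSourceConnector",
    "tasks.max": "1",
    "kafka.topic": "sftp-data"
  }
}
```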
-
Automated security updates showed these alerts:
| Dependency | Version | Upgrade to |
| --- | --- | --- |
| rack | >= 2.0.0, < 2.0.6 | ~> 2.0.6 |
Vu…
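The remediation the alert asks for is pinning `rack` to the patched series. A sketch of the Gemfile change, assuming the app can stay on the 2.0.x line:

```ruby
# Gemfile -- move off the vulnerable >= 2.0.0, < 2.0.6 range
gem "rack", "~> 2.0.6"
```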