-
**Purpose:**
I believe we should use the identifiers provided by the relevant platforms whenever feasible. Amazon has its own internal identifiers ([ARNs](https://docs.aws.amazon.com/IAM/latest/…
-
I am implementing this in Docker.
I ran `docker-compose up` and everything went well, but on the NameNode UI, when I tried to upload a file, I got the following error:
`Access to XMLHttpRequest …
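An `Access to XMLHttpRequest …` error in the browser is usually a CORS rejection rather than an HDFS failure. One possible workaround, sketched here under the assumption that the Docker image uses a standard Hadoop configuration, is to enable Hadoop's built-in CrossOriginFilter in `core-site.xml` (the property names come from Hadoop's HTTP cross-origin support; whether this resolves the specific setup depends on the image):

```xml
<!-- Enable CORS on Hadoop's embedded HTTP servers (CrossOriginFilter) -->
<property>
  <name>hadoop.http.cross-origin.enabled</name>
  <value>true</value>
</property>
<!-- Allow requests from any origin; narrow this for production use -->
<property>
  <name>hadoop.http.cross-origin.allowed-origins</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.http.cross-origin.allowed-methods</name>
  <value>GET,PUT,POST,OPTIONS,HEAD</value>
</property>
```

After changing the config, the NameNode container needs to be restarted for the filter to take effect.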
-
Export (v1) / Import (v2) tasks should support the ability to export environment variables and import them back.
Refer to https://github.com/saagie/gradle-saagie-dataops-plugin/wiki/projectsExportV1
-
Automated security updates showed these alerts:
| Dependency | Version | Upgrade to |
| --- | --- | --- |
| rack | >= 2.0.0, < 2.0.6 | ~> 2.0.6 |
Vu…
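The suggested fix from the alert can be expressed as a Bundler constraint; a minimal sketch, assuming the project declares its dependencies in a Gemfile:

```ruby
# Pin rack to the patched 2.0.x series suggested by the alert
gem 'rack', '~> 2.0.6'
```

Running `bundle update rack` afterwards would apply the constraint.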
-
I'm using ELK 7.6 and trying to export to an S3 bucket using the csv codec. So far things work, but there's an issue: I'm uploading files every 5 minutes, and only the first "part" has the headers…
-
Does the tool "copy from HDFS" communicate only via the NameNode port, which is usually 50070?
Or can it also use other ports, such as those of the datanodes or ZooKeeper?
Additional question: If the customer is n…
-
#### Problem description
When trying to run the code below:
`smart_open.open("wasb://someContainer/test.csv", transport_params={...})`
or
`smart_open.open("wasbs://someContainer/test.csv", tr…
-
Through the `Storage` trait, `quickwit` supports more than one storage backend, but there is a lot of redundant work in between. Perhaps we can try to use `OpenDAL`, with the community's efforts, to support more …
-
Hi,
I'm getting a segfault when trying to create a connection using the `HDFileSystem` constructor. The code that I'm running is:
```
from hdfs3 import HDFileSystem
hdfs = HDFileSystem([HOSTNAME],…
-
## CVE-2020-9492 - High Severity Vulnerability
Vulnerable Library - hadoop-hdfs-2.5.1.jar
Apache Hadoop HDFS
Path to dependency file: /foxtrot-core/pom.xml
Path to vulnerable library: /home/wss-scan…