fast test #1

Open umang8223 opened 1 year ago

umang8223 commented 1 year ago

I have a question regarding the fast-test script fast_test_check.py (https://github.com/ClickHouse/ClickHouse/blob/master/tests/ci/fast_test_check.py). When I run the function below in my GitHub Actions workflow:

def get_fasttest_cmd(workspace, output_path, repo_path, pr_number, commit_sha, image):
    return (
        f"docker run --cap-add=SYS_PTRACE "
        "--network=host "  # required to get access to IAM credentials
        f"-e FASTTEST_WORKSPACE=/fasttest-workspace -e FASTTEST_OUTPUT=/test_output "
        f"-e FASTTEST_SOURCE=/ClickHouse --cap-add=SYS_PTRACE "
        f"-e FASTTEST_CMAKE_FLAGS='-DCOMPILER_CACHE=sccache' "
        f"-e PULL_REQUEST_NUMBER={pr_number} -e COMMIT_SHA={commit_sha} "
        f"-e COPY_CLICKHOUSE_BINARY_TO_OUTPUT=1 "
        f"-e SCCACHE_BUCKET={S3_BUILDS_BUCKET} -e SCCACHE_S3_KEY_PREFIX=ccache/sccache "
        f"--volume={workspace}:/fasttest-workspace --volume={repo_path}:/ClickHouse "
        f"--volume={output_path}:/test_output {image}"
    )
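
If I read it right, the "--network=host" comment refers to the AWS EC2 instance metadata service: on ClickHouse's AWS-hosted runners, sccache presumably picks up temporary IAM credentials from http://169.254.169.254, and host networking lets the container reach that endpoint. A minimal probe of that assumption (AWS-specific, IMDSv1; newer AWS instances may require IMDSv2 tokens, and on an IBM Cloud VSI this endpoint will not serve AWS credentials, which may be the root of the failures below):

import urllib.request

# Hypothetical probe for the AWS EC2 instance metadata service (IMDSv1).
# On an AWS host this lists the IAM role whose temporary credentials
# sccache would use; on an IBM Cloud VSI it is expected to fail.
IMDS_URL = "http://169.254.169.254/latest/meta-data/iam/security-credentials/"

try:
    with urllib.request.urlopen(IMDS_URL, timeout=2) as resp:
        print("IAM role visible to sccache:", resp.read().decode())
except Exception as exc:  # no metadata service => no implicit AWS credentials
    print("No EC2 instance metadata reachable:", exc)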

For IBM Cloud, I run into the errors below:

2023-06-09 21:00:09 sccache: error: Timed out waiting for server startup
2023-06-09 21:00:09 ninja: build stopped: subcommand failed.
2023-06-09 21:00:09 
2023-06-09 21:00:09 real    10.537
2023-06-09 21:00:09 user    0.390
2023-06-09 21:00:09 sys 0.311
++ jobs -pr
+ kill
kill: usage: kill [-s sigspec | -n signum | -sigspec] pid | jobspec ... or kill -l [sigspec]
+ :
++ jobs -pr
+ kill
kill: usage: kill [-s sigspec | -n signum | -sigspec] pid | jobspec ... or kill -l [sigspec]
+ :
INFO:root:Run failed
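
The generic "Timed out waiting for server startup" message hides the underlying storage error. As the second log further down shows, the fasttest entrypoint retries the stats call with SCCACHE_NO_DAEMON=1, which makes sccache print the real cause. A sketch of running that diagnostic step by hand (bucket and prefix values are placeholders):

import os
import subprocess

# Sketch: run the stats call with SCCACHE_NO_DAEMON=1, as the fasttest
# entrypoint does, so sccache reports the real storage error instead of
# the generic "Timed out waiting for server startup".
env = dict(os.environ)
env["SCCACHE_NO_DAEMON"] = "1"                 # keep sccache in the foreground
env["SCCACHE_BUCKET"] = "my-builds-bucket"     # placeholder bucket name
env["SCCACHE_S3_KEY_PREFIX"] = "ccache/sccache"

subprocess.run(["sccache", "--show-stats"], env=env, check=False)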

When I pass additional credentials in that command to authenticate with my IBM Cloud COS (S3-compatible) storage, the updated command is below:

def get_fasttest_cmd(workspace, output_path, repo_path, pr_number, commit_sha, image):
    return (
        f"docker run --cap-add=SYS_PTRACE "
        "--network=host "  # required to get access to IAM credentials
        f"-e FASTTEST_WORKSPACE=/fasttest-workspace -e FASTTEST_OUTPUT=/test_output "
        f"-e FASTTEST_SOURCE=/ClickHouse --cap-add=SYS_PTRACE "
        f"-e FASTTEST_CMAKE_FLAGS='-DCOMPILER_CACHE=sccache' "
        f"-e PULL_REQUEST_NUMBER={pr_number} -e COMMIT_SHA={commit_sha} "
        f"-e COPY_CLICKHOUSE_BINARY_TO_OUTPUT=1 "
        f"-e SCCACHE_BUCKET={S3_BUILDS_BUCKET} -e SCCACHE_S3_KEY_PREFIX=ccache/sccache "
        f"-e AWS_ACCESS_KEY_ID={AWS_ACCESS_KEY_ID} "
        f"-e AWS_SECRET_ACCESS_KEY={AWS_SECRET_ACCESS_KEY} "
        f"-e AWS_DEFAULT_REGION={S3_REGION} "
        f"-e S3_ENDPOINT={S3_ENDPOINT} "
        f"--volume={workspace}:/fasttest-workspace --volume={repo_path}:/ClickHouse "
        f"--volume={output_path}:/test_output {image}"
    )

The failure in the logs for the modified command above is:

+ ccache --show-stats
Summary:
  Hits:               0 /     0
    Direct:           0 /     0
    Preprocessed:     0 /     0
  Misses:             0
    Direct:           0
    Preprocessed:     0
Primary storage:
  Hits:               0 /     0
  Misses:             0
  Cache size (GB): 0.00 / 15.00 (0.00 %)

Use the -v/--verbose option for more details.
+ SCCACHE_NO_DAEMON=1
+ sccache --show-stats
sccache: error: Server startup failed: cache storage failed to read: ObjectPermissionDenied (permanent) at read => S3Error { code: "InvalidAccessKeyId", message: "The AWS Access Key Id you provided does not exist in our records.", resource: "", request_id: "9XC43C3A9XD8K7EW" }

Context:
    response: Parts { status: 403, version: HTTP/1.1, headers: {"x-amz-request-id": "9XC43C3A9XD8K7EW", "x-amz-id-2": "HEveYw+VguijwKfasPvKC8xy174XxHBF39hrLgMGc6Bs51AGUI3RbAHiTggRmdz/nhp+DGG1wOQ=", "content-type": "application/xml", "transfer-encoding": "chunked", "date": "Fri, 09 Jun 2023 21:39:59 GMT", "server": "AmazonS3"} }
    service: s3
    path: .sccache_check
    range: 0-

sccache: error: cache storage failed to read: ObjectPermissionDenied (permanent) at read => S3Error { code: "InvalidAccessKeyId", message: "The AWS Access Key Id you provided does not exist in our records.", resource: "", request_id: "9XC43C3A9XD8K7EW" }

Context:
    response: Parts { status: 403, version: HTTP/1.1, headers: {"x-amz-request-id": "9XC43C3A9XD8K7EW", "x-amz-id-2": "HEveYw+VguijwKfasPvKC8xy174XxHBF39hrLgMGc6Bs51AGUI3RbAHiTggRmdz/nhp+DGG1wOQ=", "content-type": "application/xml", "transfer-encoding": "chunked", "date": "Fri, 09 Jun 2023 21:39:59 GMT", "server": "AmazonS3"} }
    service: s3
    path: .sccache_check
    range: 0-
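
Two details in this log stand out to me: the 403 response headers report "server": "AmazonS3", and the error is InvalidAccessKeyId, so the request appears to go to AWS S3 itself rather than to the IBM COS endpoint. As far as I can tell, sccache's S3 backend does not read a generic S3_ENDPOINT variable; its documentation mentions SCCACHE_ENDPOINT (and SCCACHE_REGION) for custom S3-compatible endpoints. A sketch of the command rewritten under that assumption, with IBM COS HMAC keys exported as the standard AWS variables (all values are placeholders):

# Placeholders; substitute a real COS endpoint, region, bucket, and HMAC keys.
S3_BUILDS_BUCKET = "my-builds-bucket"
S3_ENDPOINT = "s3.us-south.cloud-object-storage.appdomain.cloud"  # host form; newer sccache also accepts a URL
S3_REGION = "us-south"
AWS_ACCESS_KEY_ID = "PASTE_HMAC_ACCESS_KEY_ID"          # IBM COS HMAC access key
AWS_SECRET_ACCESS_KEY = "PASTE_HMAC_SECRET_ACCESS_KEY"  # IBM COS HMAC secret key

def get_fasttest_cmd_cos(workspace, output_path, repo_path, pr_number, commit_sha, image):
    return (
        f"docker run --cap-add=SYS_PTRACE "
        "--network=host "
        f"-e FASTTEST_WORKSPACE=/fasttest-workspace -e FASTTEST_OUTPUT=/test_output "
        f"-e FASTTEST_SOURCE=/ClickHouse "
        f"-e FASTTEST_CMAKE_FLAGS='-DCOMPILER_CACHE=sccache' "
        f"-e PULL_REQUEST_NUMBER={pr_number} -e COMMIT_SHA={commit_sha} "
        f"-e COPY_CLICKHOUSE_BINARY_TO_OUTPUT=1 "
        f"-e SCCACHE_BUCKET={S3_BUILDS_BUCKET} -e SCCACHE_S3_KEY_PREFIX=ccache/sccache "
        f"-e SCCACHE_ENDPOINT={S3_ENDPOINT} "  # custom S3-compatible endpoint (assumed variable name)
        f"-e SCCACHE_REGION={S3_REGION} "
        f"-e AWS_ACCESS_KEY_ID={AWS_ACCESS_KEY_ID} "
        f"-e AWS_SECRET_ACCESS_KEY={AWS_SECRET_ACCESS_KEY} "
        f"--volume={workspace}:/fasttest-workspace --volume={repo_path}:/ClickHouse "
        f"--volume={output_path}:/test_output {image}"
    )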

I also tried similar alternative commands, shown below; they also run into errors. I even tried removing "--network=host" from the command below.

def get_fasttest_cmd(workspace, output_path, repo_path, pr_number, commit_sha, image):
    return (
        f"docker run --cap-add=SYS_PTRACE "
        "--network=host "  # required to get access to IAM credentials
        f"-e FASTTEST_WORKSPACE=/fasttest-workspace -e FASTTEST_OUTPUT=/test_output "
        f"-e FASTTEST_SOURCE=/ClickHouse --cap-add=SYS_PTRACE "
        f"-e FASTTEST_CMAKE_FLAGS='-DCOMPILER_CACHE=sccache' "
        f"-e PULL_REQUEST_NUMBER={pr_number} -e COMMIT_SHA={commit_sha} "
        f"-e COPY_CLICKHOUSE_BINARY_TO_OUTPUT=1 "
        f"-e SCCACHE_BUCKET={S3_BUILDS_BUCKET} -e SCCACHE_S3_KEY_PREFIX=ccache/sccache "
        f"-e IBM_CLOUD_API_KEY={IBM_CLOUD_API_KEY} "
        f"-e AWS_DEFAULT_REGION={S3_REGION} "
        f"-e S3_ENDPOINT={S3_ENDPOINT} "
        f"--volume={workspace}:/fasttest-workspace --volume={repo_path}:/ClickHouse "
        f"--volume={output_path}:/test_output {image}"
    )
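
For completeness: as far as I know, sccache's S3 backend has no notion of an IBM IAM API key, so IBM_CLOUD_API_KEY would simply be ignored. IBM COS instead provides HMAC service credentials (an access-key/secret pair) specifically for S3-compatible tools. A minimal check of such HMAC keys with plain boto3 before handing them to sccache (endpoint, bucket, and keys are placeholders):

import boto3

# Sketch: validate IBM COS HMAC credentials against the COS endpoint using
# standard boto3; all values below are placeholders.
cos = boto3.client(
    "s3",
    endpoint_url="https://s3.us-south.cloud-object-storage.appdomain.cloud",
    aws_access_key_id="PASTE_HMAC_ACCESS_KEY_ID",
    aws_secret_access_key="PASTE_HMAC_SECRET_ACCESS_KEY",
)
cos.head_bucket(Bucket="my-builds-bucket")  # raises ClientError on 403/404
print("HMAC credentials accepted by COS")
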
Felixoid commented 1 year ago

You are very welcome to create the issue in ClickHouse/ClickHouse. Hiding the discussion in an unrelated issue is not a good way to go.

umang8223 commented 1 year ago

> You are very welcome to create the issue in ClickHouse/ClickHouse. Hiding the discussion in an unrelated issue is not a good way to go.

Thank you @Felixoid. I have created a new issue in ClickHouse/ClickHouse: https://github.com/ClickHouse/ClickHouse/issues/50872

Please see if you can have a look; I have been stuck on this problem for a couple of days.