Closed: tanishq-akula closed this issue 3 months ago
Please show us your manifest.template file
Contents of client.manifest.template
# Secret Provisioning manifest file example (Protected Files client)
loader.entrypoint = "file:{{ gramine.libos }}"
libos.entrypoint = "/client"
loader.log_level = "{{ log_level }}"
loader.env.LD_LIBRARY_PATH = "/lib:{{ arch_libdir }}:/usr{{ arch_libdir }}"
loader.env.LD_PRELOAD = "libsecret_prov_attest.so"
loader.env.SECRET_PROVISION_CONSTRUCTOR = "1"
loader.env.SECRET_PROVISION_SET_KEY = "default"
loader.env.SECRET_PROVISION_CA_CHAIN_PATH = "/ca.crt"
loader.env.SECRET_PROVISION_SERVERS = "dummyserver:80;localhost:4433;anotherdummy:4433"
loader.insecure__use_cmdline_argv = true
fs.mounts = [
{ path = "/lib", uri = "file:{{ gramine.runtimedir() }}" },
{ path = "{{ arch_libdir }}", uri = "file:{{ arch_libdir }}" },
{ path = "/usr{{ arch_libdir }}", uri = "file:/usr{{ arch_libdir }}" },
{ path = "/etc", uri = "file:/etc" },
{ path = "/client", uri = "file:client" },
{ path = "/ca.crt", uri = "file:../ssl/ca.crt" },
{ path = "/files/", uri = "file:enc_files/", type = "encrypted" },
]
sys.enable_extra_runtime_domain_names_conf = true
sgx.enclave_size = "512M"
sgx.debug = true
sgx.remote_attestation = "{{ ra_type }}"
sgx.ra_client_spid = "{{ ra_client_spid }}"
sgx.ra_client_linkable = {{ 'true' if ra_client_linkable == '1' else 'false' }}
sgx.trusted_files = [
"file:{{ gramine.libos }}",
"file:client",
"file:{{ gramine.runtimedir() }}/",
"file:{{ arch_libdir }}/",
"file:/usr{{ arch_libdir }}/",
"file:../ssl/ca.crt",
]
sgx.allowed_files = [
"file:/etc/nsswitch.conf",
"file:/etc/ethers",
"file:/etc/host.conf",
"file:/etc/hosts",
"file:/etc/group",
"file:/etc/passwd",
"file:/etc/gai.conf",
]
Hm, can you set loader.log_level = "all" and show the complete Gramine log?
Attached client gramine log Client_log.txt
Ok, something is totally broken:
[P1:T1:client] trace: ---- openat(AT_FDCWD, "/lib/libc.so.6", O_RDONLY|0x80000, 0000) = 0x3
...
[P1:T1:client] trace: ---- openat(AT_FDCWD, "/usr/lib64/tls/libpthread.so.0", O_RDONLY|0x80000, 0000) = -2
[P1:T1:client] trace: ---- openat(AT_FDCWD, "/usr/lib64/libpthread.so.0", O_RDONLY|0x80000, 0000) = 0x3
So we see that libc.so is correctly taken from the Gramine-specific /lib directory, but libpthread.so is taken from the system-wide /usr/lib64 directory.
Why this happens in the same process, I don't know.
@tanishq-akula How exactly did you install Gramine? Did you modify anything in Gramine or in the ra-tls-secret-prov example?
Hey Dmitrii,
We installed Gramine directly with the dnf command on a RHEL 8.6 host, as specified in the documentation:
https://gramine.readthedocs.io/en/stable/quickstart.html
To get the other DCAP libraries, we built Gramine with the meson command as per the building instructions referenced above.
We didn't modify anything in gramine or in any of the examples.
@tanishq-akula Did you git checkout the appropriate tag when cloning the repo and building things? I don't see any instruction given by @dimakuv above, so I'm not sure what you are talking about.
I used the link below for building Gramine with DCAP: https://gramine.readthedocs.io/en/stable/devel/building.html
I used the command below to clone the Gramine repo, as per the quick start guide:
git clone --depth 1 --branch v1.3.1 https://github.com/gramineproject/gramine.git
[root@zis26 CI-Examples]# git show --oneline -s
e18bc05 (grafted, HEAD, tag: v1.3.1) Bump Gramine version to 1.3.1
@tanishq-akula Could you show the output of ls -la /usr/lib64/gramine/runtime/glibc?
Also, how did you build the ra-tls-secret-prov example? Can you show the exact commands that you used?
[root@zis26 CI-Examples]# ls -la /usr/lib64/gramine/runtime/glibc
total 5272
drwxr-xr-x 2 root root 4096 Nov 16 22:16 .
drwxr-xr-x 4 root root 31 Nov 2 20:13 ..
-rw-r--r-- 1 root root 2176 Sep 29 14:42 crt1.o
-rw-r--r-- 1 root root 1216 Sep 29 14:42 crti.o
-rw-r--r-- 1 root root 648 Sep 29 14:42 crtn.o
-rwxr-xr-x 1 root root 242848 Sep 29 14:42 ld-linux-x86-64.so.2
lrwxrwxrwx 1 root root 20 Sep 29 14:42 ld.so -> ld-linux-x86-64.so.2
lrwxrwxrwx 1 root root 11 Sep 29 14:42 libanl.so -> libanl.so.1
-rwxr-xr-x 1 root root 8312 Sep 29 14:42 libanl.so.1
lrwxrwxrwx 1 root root 9 Sep 29 14:42 libc.so -> libc.so.6
-rwxr-xr-x 1 root root 2359160 Sep 29 14:42 libc.so.6
lrwxrwxrwx 1 root root 10 Sep 29 14:42 libdl.so -> libdl.so.2
-rwxr-xr-x 1 root root 8664 Sep 29 14:42 libdl.so.2
lrwxrwxrwx 1 root root 32 Sep 29 14:42 libmbedcrypto_gramine.a -> ../../../libmbedcrypto_gramine.a
lrwxrwxrwx 1 root root 33 Sep 29 14:42 libmbedcrypto_gramine.so -> ../../../libmbedcrypto_gramine.so
lrwxrwxrwx 1 root root 36 Sep 29 14:42 libmbedcrypto_gramine.so.12 -> ../../../libmbedcrypto_gramine.so.12
lrwxrwxrwx 1 root root 29 Sep 29 14:42 libmbedtls_gramine.a -> ../../../libmbedtls_gramine.a
lrwxrwxrwx 1 root root 30 Sep 29 14:42 libmbedtls_gramine.so -> ../../../libmbedtls_gramine.so
lrwxrwxrwx 1 root root 33 Sep 29 14:42 libmbedtls_gramine.so.18 -> ../../../libmbedtls_gramine.so.18
lrwxrwxrwx 1 root root 30 Sep 29 14:42 libmbedx509_gramine.a -> ../../../libmbedx509_gramine.a
lrwxrwxrwx 1 root root 31 Sep 29 14:42 libmbedx509_gramine.so -> ../../../libmbedx509_gramine.so
lrwxrwxrwx 1 root root 33 Sep 29 14:42 libmbedx509_gramine.so.4 -> ../../../libmbedx509_gramine.so.4
lrwxrwxrwx 1 root root 9 Sep 29 14:42 libm.so -> libm.so.6
-rwxr-xr-x 1 root root 993808 Sep 29 14:42 libm.so.6
lrwxrwxrwx 1 root root 12 Sep 29 14:42 libmvec.so -> libmvec.so.1
-rwxr-xr-x 1 root root 1068824 Sep 29 14:42 libmvec.so.1
lrwxrwxrwx 1 root root 11 Sep 29 14:42 libnsl.so -> libnsl.so.1
-rwxr-xr-x 1 root root 112752 Sep 29 14:42 libnsl.so.1
lrwxrwxrwx 1 root root 18 Sep 29 14:42 libnss_compat.so -> libnss_compat.so.2
-rwxr-xr-x 1 root root 38544 Sep 29 14:42 libnss_compat.so.2
lrwxrwxrwx 1 root root 14 Sep 29 14:42 libnss_db.so -> libnss_db.so.2
-rwxr-xr-x 1 root root 37312 Sep 29 14:42 libnss_db.so.2
lrwxrwxrwx 1 root root 15 Sep 29 14:42 libnss_dns.so -> libnss_dns.so.2
-rwxr-xr-x 1 root root 8000 Sep 29 14:42 libnss_dns.so.2
lrwxrwxrwx 1 root root 17 Sep 29 14:42 libnss_files.so -> libnss_files.so.2
-rwxr-xr-x 1 root root 8000 Sep 29 14:42 libnss_files.so.2
lrwxrwxrwx 1 root root 15 Sep 29 14:42 libpthread.so -> libpthread.so.0
-rwxr-xr-x 1 root root 10056 Sep 29 14:42 libpthread.so.0
lrwxrwxrwx 1 root root 28 Sep 29 14:42 libra_tls_attest.so -> ../../../libra_tls_attest.so
-rwxr-xr-x 1 root root 52248 Nov 16 05:10 libra_tls_verify_dcap_gramine.so
lrwxrwxrwx 1 root root 33 Sep 29 14:42 libra_tls_verify_epid.so -> ../../../libra_tls_verify_epid.so
lrwxrwxrwx 1 root root 14 Sep 29 14:42 libresolv.so -> libresolv.so.2
-rwxr-xr-x 1 root root 70872 Sep 29 14:42 libresolv.so.2
lrwxrwxrwx 1 root root 10 Sep 29 14:42 librt.so -> librt.so.1
-rwxr-xr-x 1 root root 9352 Sep 29 14:42 librt.so.1
lrwxrwxrwx 1 root root 33 Sep 29 14:42 libsecret_prov_attest.so -> ../../../libsecret_prov_attest.so
-rwxr-xr-x 1 root root 267944 Nov 10 23:41 libsecret_prov_verify_dcap.so
lrwxrwxrwx 1 root root 38 Sep 29 14:42 libsecret_prov_verify_epid.so -> ../../../libsecret_prov_verify_epid.so
lrwxrwxrwx 1 root root 23 Sep 29 14:42 libsgx_util.so -> ../../../libsgx_util.so
lrwxrwxrwx 1 root root 17 Sep 29 14:42 libthread_db.so -> libthread_db.so.1
-rwxr-xr-x 1 root root 40960 Sep 29 14:42 libthread_db.so.1
lrwxrwxrwx 1 root root 12 Sep 29 14:42 libutil.so -> libutil.so.1
-rwxr-xr-x 1 root root 8312 Sep 29 14:42 libutil.so.1
Command to build ra-tls-secret-prov: ARCH_LIBDIR=/lib64 make app dcap RA_TYPE=dcap
After this, we executed the commands below to run the application:
cd secret_prov_pf
RA_TLS_ALLOW_DEBUG_ENCLAVE_INSECURE=1 \
RA_TLS_ALLOW_OUTDATED_TCB_INSECURE=1 \
./server_dcap wrap_key &
gramine-sgx ./client
Everything looks perfect... I just don't see why the app tries to open /usr/lib64/libpthread.so.0 instead of /usr/lib64/gramine/runtime/glibc/libpthread.so.0...
Do other programs (other than ra-tls-secret-prov) run fine under Gramine on your system? What about e.g. the Python examples?
From the log it looks like, after libc.so is loaded, LD_LIBRARY_PATH starts to be ignored?
@tanishq-akula Please add loader.env.LD_DEBUG = "libs" to your manifest, rebuild, rerun, and provide us with the Gramine logs + output.
This generally looks weird; I have no idea what's wrong.
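For context, LD_DEBUG=libs is a glibc loader feature that can also be tried on the host, outside Gramine, to see the exact search order used for every library. A minimal sketch, with /bin/ls standing in for the actual ./client binary:

```shell
# LD_DEBUG=libs makes the glibc dynamic loader log (on stderr) which
# directories it searches for each needed library and where it finds it.
LD_DEBUG=libs /bin/ls / > /dev/null 2> ld_debug.log

# Lines like "find library=libc.so.6" show the probe order actually used
# (DT_RPATH/DT_RUNPATH, then LD_LIBRARY_PATH, then the ld.so cache).
grep "find library" ld_debug.log | head -n 3
```

If a library is resolved from an unexpected directory, the surrounding "search path=" lines in the log tell you which search-path source was responsible.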
@boryspoplawski I added loader.env.LD_DEBUG = "libs" to the client.manifest.template file, rebuilt, and reran as suggested.
Please refer to the attached file for the Gramine logs and output: Client_Log.txt
Everything looks perfect... I just don't see why the app tries to open /usr/lib64/libpthread.so.0 instead of /usr/lib64/gramine/runtime/glibc/libpthread.so.0... Do other programs (other than ra-tls-secret-prov) run fine under Gramine on your system? What about e.g. Python examples?
@dimakuv Other programs in ra-tls-secret-prov are throwing the same error. The Redis and Nginx examples ran successfully, but I encountered the following error while building the Python example:
[root@zis26 python]# ARCH_LIBDIR=/lib64 make DEBUG=1 SGX=1 RA_TYPE=dcap
gramine-manifest \
-Dlog_level=debug \
-Darch_libdir=/lib64 \
-Dentrypoint=/usr/libexec/platform-python3.6 \
-Dra_type=dcap \
-Dra_client_spid= \
-Dra_client_linkable=0 \
python.manifest.template >python.manifest
gramine-sgx-sign \
--manifest python.manifest \
--output python.manifest.sgx
Traceback (most recent call last):
File "/usr/bin/gramine-sgx-sign", line 74, in <module>
main() # pylint: disable=no-value-for-parameter
File "/usr/lib/python3.6/site-packages/click/core.py", line 721, in __call__
return self.main(*args, **kwargs)
File "/usr/lib/python3.6/site-packages/click/core.py", line 696, in main
rv = self.invoke(ctx)
File "/usr/lib/python3.6/site-packages/click/core.py", line 894, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/usr/lib/python3.6/site-packages/click/core.py", line 534, in invoke
return callback(*args, **kwargs)
File "/usr/bin/gramine-sgx-sign", line 34, in main
expanded = manifest.expand_all_trusted_files()
File "/usr/lib64/python3.6/site-packages/graminelibos/manifest.py", line 175, in expand_all_trusted_files
append_trusted_dir_or_file(trusted_files, tf, expanded)
File "/usr/lib64/python3.6/site-packages/graminelibos/manifest.py", line 56, in append_trusted_dir_or_file
raise ManifestError(f'Cannot resolve {path}')
graminelibos.manifest.ManifestError: Cannot resolve /usr/lib64/python3/dist-packages
make: *** [Makefile:36: sgx_sign] Error 1
Ok, another person reported the same problem on CentOS 8 Stream (which is basically the same as RHEL 8).
The most interesting part is that it worked before -- there were no changes to GSC or Gramine (the versions are pinned in that other person's case).
So the only plausible root cause is a breaking update in CentOS/RHEL. This needs debugging...
@tanishq-akula What's the content of LD_LIBRARY_PATH in the final manifest (i.e. after rendering)? The file is probably named client.manifest.sgx.
Wait: RUNPATH from file /client -- how did you compile the program, and why does it have a RUNPATH?
Ah, this is the secret_prov_pf example.
I'm pretty sure $(shell pkg-config --libs sgx_util) is at fault. Why it affects how paths are resolved, and why LD_LIBRARY_PATH is ignored after loading libc, that I do not know.
@tanishq-akula @boryspoplawski I debugged a similar error on another workload on a CentOS 8 machine.
In that other case, the root cause was this:
$ strings /usr/lib64/libtcmalloc.so.4 | grep lib64
/usr/lib/../lib64
So there was this library libtcmalloc.so.4, which was loaded inside the SGX enclave (in fact, it was the second library to be loaded, right after libc.so). And this particular version of the library, installed on that CentOS 8 machine, has the absolute paths to its dependencies hard-coded.
So, for example, libtcmalloc.so.4 has a dependency on libpthread.so, and it loaded it directly from /usr/lib/../lib64/libpthread.so, which after path normalization is simply /usr/lib64/libpthread.so. So libpthread was not taken from the Gramine-specific Glibc path, and at this point there is a combination of wrong versions of libpthread.so (system-wide lib) and libc.so (Gramine-specific lib).
I'm pretty sure the same problem happens in @tanishq-akula's case. @tanishq-akula, could you check the output of:
strings /usr/lib64/libcurl.so.4 | grep lib64
strings /usr/lib64/libcjson.so.1 | grep lib64
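An aside (my addition, not from the original thread): hard-coded search paths like the one above usually come from a DT_RPATH or DT_RUNPATH entry baked into the ELF at link time, and readelf shows them directly. A sketch, using /bin/ls as a placeholder for the suspect binary or library:

```shell
# DT_RPATH is honored *before* LD_LIBRARY_PATH; DT_RUNPATH *after* it (and
# only for the object's direct dependencies). Either can explain a library
# being resolved from a system directory despite LD_LIBRARY_PATH.
readelf -d /bin/ls | grep -E 'RPATH|RUNPATH' || echo "no RPATH/RUNPATH entry"

# The NEEDED entries are the dependencies that get resolved this way:
readelf -d /bin/ls | grep NEEDED
```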
Just for your information: I was also considering that maybe the problem is in this quarantine feature: https://modules.readthedocs.io/en/v4.1.0/MIGRATING.html#quarantine-mechanism-to-protect-module-execution
That's because I observed the MODULES_RUN_QUARANTINE=LD_LIBRARY_PATH LD_PRELOAD environment variable set in that other CentOS8-based Docker image. So I thought maybe this envvar destroyed the LD_LIBRARY_PATH that was specified in Gramine's manifest. But this was not the case.
I'll leave this observation here just for future reference -- maybe some day we'll hit this issue too...
@dimakuv strings /usr/lib64/libcurl.so.4 | grep lib64 and strings /usr/lib64/libcjson.so.1 | grep lib64 are not producing any output.
@tanishq-akula Thanks for checking. The situation is worse than I thought :)
Ok, so two more things:
First, you can just stop using the Gramine-specific Glibc files. For this, remove these lines from the manifest.template:
- { path = "/lib", uri = "file:{{ gramine.runtimedir() }}" }, (an entry in fs.mounts)
- "file:{{ gramine.runtimedir() }}/", (an entry in sgx.trusted_files)
- /lib: from LD_LIBRARY_PATH (it becomes useless)
Second, could you give us some ideas how to reproduce this issue? We don't have access to RHEL 8.6, and I cannot reproduce this problem on CentOS 8 or CentOS 8 Stream.
Notice that the first item is more of a workaround -- it disables the Gramine-specific libc.so, libpthread.so, and other common libraries. This leads Gramine to use the system-wide libraries (thus I expect the version mismatch to disappear, because all libs will be system-wide now). But this comes at the cost of worse performance of the Gramine execution.
@dimakuv With your workaround, the application worked. Here is the output:
[root@zis26 secret_prov_pf]# gramine-sgx ./client
Gramine is starting. Parsing TOML manifest file, this may take some time...
-----------------------------------------------------------------------------------------------------------------------
Gramine detected the following insecure configurations:
- sgx.debug = true (this is a debug enclave)
- loader.insecure__use_cmdline_argv = true (forwarding command-line args from untrusted host to the app)
- sgx.allowed_files = [ ... ] (some files are passed through from untrusted host without verification)
Gramine will continue application execution, but this configuration must not be used in production!
-----------------------------------------------------------------------------------------------------------------------
Emulating a raw syscall instruction. This degrades performance, consider patching your application to use Gramine syscall API.
Received the following measurements from the client:
- MRENCLAVE: 06019f727ac3bbb9856dfa68e074c794a3d57777fa50c01b97115f2d54da215e
- MRSIGNER: a4ac310260bec0dd47797f54e9fb64dbd5d2451280fc1a6b994508e4c7a56e58
- ISV_PROD_ID: 0
- ISV_SVN: 0
[ WARNING: In reality, you would want to compare against expected values! ]
--- [main] Re-starting myself using execvp() ---
Emulating a raw syscall instruction. This degrades performance, consider patching your application to use Gramine syscall API.
--- [child] Read from protected file: 'helloworld' ---
--- [parent] Read from protected file: 'helloworld' ---
[root@zis26 secret_prov_pf]#
We have bare-metal servers with RHEL installed on them. One way to emulate this issue is to install RHEL in a VM. Red Hat provides a free license for the RHEL OS under its 'No-cost RHEL for Developers' subscription program, but we haven't tried it yet.
Oof. That sounds complicated, I don't want to register for such a program just to test RHEL.
@anjalirai-intel @aneessahib Do we maybe have RHEL 8? Did we encounter similar problems?
@dimakuv We have RHEL 8 available with us, but we have not run the mbedtls example there. We can check mbedtls; all the other examples work fine.
@tanishq-akula To build the Python example on RHEL, please use the updated python.manifest.template from the Gramine repo, because the dist-packages path does not exist there: https://github.com/gramineproject/gramine/blob/master/CI-Examples/python/python.manifest.template
@anjalirai-intel We are getting the below error with the new manifest in the Python use case.
[root@zis26 python]# ARCH_LIBDIR=/lib64 make SGX=1
gramine-manifest \
-Dlog_level=error \
-Darch_libdir=/lib64 \
-Dentrypoint=/usr/libexec/platform-python3.6 \
-Dra_type=none \
-Dra_client_spid= \
-Dra_client_linkable=0 \
python.manifest.template >python.manifest
Traceback (most recent call last):
File "/usr/bin/gramine-manifest", line 33, in <module>
main() # pylint: disable=no-value-for-parameter
File "/usr/lib/python3.6/site-packages/click/core.py", line 721, in __call__
return self.main(*args, **kwargs)
File "/usr/lib/python3.6/site-packages/click/core.py", line 696, in main
rv = self.invoke(ctx)
File "/usr/lib/python3.6/site-packages/click/core.py", line 894, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/usr/lib/python3.6/site-packages/click/core.py", line 534, in invoke
return callback(*args, **kwargs)
File "/usr/bin/gramine-manifest", line 29, in main
manifest = Manifest.from_template(template, define)
File "/usr/lib64/python3.6/site-packages/graminelibos/manifest.py", line 142, in from_template
return cls(_env.from_string(template).render(**(variables or {})))
File "/usr/local/lib64/python3.6/site-packages/jinja2/environment.py", line 1090, in render
self.environment.handle_exception()
File "/usr/local/lib64/python3.6/site-packages/jinja2/environment.py", line 832, in handle_exception
reraise(*rewrite_traceback_stack(source=source))
File "/usr/local/lib64/python3.6/site-packages/jinja2/_compat.py", line 28, in reraise
raise value.with_traceback(tb)
File "<template>", line 23, in top-level template code
jinja2.exceptions.UndefinedError: 'dict object' has no attribute 'get_sys_path'
make: *** [Makefile:20: python.manifest] Error 1
I just had a look at the GitHub repo, and it looks like we are missing changes to the library files (e.g., gen_jinja_env.py) because we cloned the v1.3.1 branch. Could you please share the build-and-install procedure from the repo instead of the dnf install commands on RHEL? We will re-install Gramine and try out all the use cases again.
@tanishq-akula If you want to build the latest Gramine from sources, then do something like this:
Uninstall the current Gramine installation!
Build and install Gramine:
git clone https://github.com/gramineproject/gramine.git
cd gramine
meson setup build/ --buildtype=release -Ddirect=enabled -Dsgx=enabled -Ddcap=enabled
ninja -C build/
sudo ninja -C build/ install
It should be:
meson setup build/ --buildtype=release -Ddirect=enabled -Dsgx=enabled -Ddcap=enabled -Dtests=enabled
ninja -C build/
sudo ninja -C build/ install
@tanishq-akula I came up with another workaround for your original problem of the libpthread.so.0 error.
Instead of the things I recommended in an above comment (where you need to remove the Gramine-specific Glibc libraries completely), you can also add an entry to fs.mounts like this:
{ path = "/usr/lib64/libpthread.so.0", uri = "file:{{ gramine.runtimedir() }}/libpthread.so.0" },
What this does is substitute the single file /usr/lib64/libpthread.so.0 with the Gramine-specific some/gramine/path/glibc/libpthread.so.0. Now your enclavized application searches for this file and finds the Gramine-specific version, which is exactly what we want.
This is not a very good fix for your issue, because you'll have to do similar tricks for all libs that exhibit similar problems. But it can circumvent your current issue, and the performance will be good.
Our current RHEL infrastructure does not support DCAP attestation.
@tanishq-akula If you want to build the latest Gramine from sources, then do something like this:
- Uninstall the current Gramine installation!
- Build and install Gramine:
git clone https://github.com/gramineproject/gramine.git
cd gramine
meson setup build/ --buildtype=release -Ddirect=enabled -Dsgx=enabled -Ddcap=enabled
ninja -C build/
sudo ninja -C build/ install
@dimakuv We cloned a new Gramine repo, followed the quoted steps, and ran the ra-tls-secret-prov example. The following error was encountered while running the client: Error.txt
@tanishq-akula Have you configured your no_proxy correctly, if you are behind a proxy?
@anjalirai-intel We haven't configured no_proxy, as we are not using any proxy setup.
[root@zis26 ra-tls-secret-prov]# env | grep -i proxy
https_proxy=
http_proxy=
no_proxy=
@tanishq-akula Just to check: is your PCCS server installed on the same localhost, or is it on a different machine? If it is on the same machine, then add no_proxy=localhost.
@anjalirai-intel The ra-tls-mbedtls example ran successfully with DCAP, and we are able to see logs in PCCS for the attestation, so the issue doesn't seem to be with our current setup. We are only facing the problem while executing the ra-tls-secret-prov example.
@boryspoplawski @mkow Can you please look at the issue?
@tanishq-akula Could you provide a bit more info?
"ran ra-tls-secret-prov example" -- ran how exactly? Can you show all the commands you typed?
"The following error was encountered while running client. Error.txt" -- the client connects to the server (the Secret Prov server) but then sees an EPIPE (broken pipe) at some point, which means that the server shut down the connection prematurely for some reason. Can you also show the log of the server run?
In general, this feels like a problem with either:
- the server missing the insecure environment variables (RA_TLS_ALLOW_DEBUG_ENCLAVE_INSECURE=1, RA_TLS_ALLOW_OUTDATED_TCB_INSECURE=1), or
- the TLS certificate being issued for localhost (see https://github.com/gramineproject/gramine/blob/da990909010a5989dc89c63f4c7a22d78fb3f5c6/CI-Examples/ra-tls-secret-prov/ssl/ca_config.conf#L14), while the server is located not on localhost but e.g. outside of the Docker-container network.
@dimakuv We are running everything on localhost. Commands executed:
ARCH_LIBDIR=/lib64 make app dcap RA_TYPE=dcap
RA_TLS_ALLOW_DEBUG_ENCLAVE_INSECURE=1 RA_TLS_ALLOW_OUTDATED_TCB_INSECURE=1 ./server_dcap wrap_key &
gramine-sgx ./client
Server log:
[root@zis26 secret_prov_pf]# --- Reading the master key for encrypted files from 'wrap_key' ---
--- Starting the Secret Provisioning server on port 4433 ---
client_connection: Secret Provisioning failed during mbedtls_ssl_handshake with error -15104
client_connection: Secret Provisioning failed during mbedtls_ssl_handshake with error -15104
Hm, this error code is: mbedtls_ssl_handshake() failed: -0x3b00 (-15104): PK - The pubkey tag or value is invalid (only RSA and EC are supported)
This is very surprising; I haven't seen errors like this before.
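For reference (my addition, not from the original thread): mbedTLS error codes are negative numbers whose hex form maps to a per-module constant; -15104 is -0x3B00, which pk.h defines as MBEDTLS_ERR_PK_INVALID_PUBKEY. The conversion can be checked in the shell (the header path below is distro-dependent and may not exist):

```shell
# Convert the raw decimal error to the hex form that mbedTLS documents.
printf 'raw=%d  hex=-0x%04X\n' -15104 15104   # prints: raw=-15104  hex=-0x3B00

# If the mbedTLS development headers are installed, look up the constant
# directly (the include path varies by distro; this may print nothing):
grep -rh "0x3B00" /usr/include/mbedtls/ 2>/dev/null || true
```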
Could you perform ldd server_dcap and show us the output? Can you do the same for the client: ldd client?
ldd server_dcap output:
[root@zis26 secret_prov_pf]# ldd server_dcap
linux-vdso.so.1 (0x00007ffef4ffe000)
libsgx_util.so => /usr/lib64/libsgx_util.so (0x00007f1a25949000)
libsgx_urts.so => /usr/lib64/libsgx_urts.so (0x00007f1a25fd7000)
libsecret_prov_verify_dcap.so => /usr/lib64/libsecret_prov_verify_dcap.so (0x00007f1a2570f000)
libpthread.so.0 => /usr/lib64/libpthread.so.0 (0x00007f1a254ef000)
libc.so.6 => /usr/lib64/libc.so.6 (0x00007f1a2512a000)
libcjson.so.1 => /usr/lib64/libcjson.so.1 (0x00007f1a24f22000)
libcurl.so.4 => /usr/lib64/libcurl.so.4 (0x00007f1a24c94000)
libsgx_enclave_common.so.1 => /lib64/libsgx_enclave_common.so.1 (0x00007f1a25fba000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007f1a24a90000)
libstdc++.so.6 => /lib64/libstdc++.so.6 (0x00007f1a246fb000)
libm.so.6 => /lib64/libm.so.6 (0x00007f1a24379000)
libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f1a24161000)
libsgx_dcap_quoteverify.so.1 => /lib64/libsgx_dcap_quoteverify.so.1 (0x00007f1a23f3c000)
/lib64/ld-linux-x86-64.so.2 (0x00007f1a25dd6000)
libnghttp2.so.14 => /usr/lib64/libnghttp2.so.14 (0x00007f1a23d15000)
libidn2.so.0 => /usr/lib64/libidn2.so.0 (0x00007f1a23af7000)
libssh.so.4 => /usr/lib64/libssh.so.4 (0x00007f1a23888000)
libpsl.so.5 => /usr/lib64/libpsl.so.5 (0x00007f1a23677000)
libssl.so.1.1 => /usr/lib64/libssl.so.1.1 (0x00007f1a233e3000)
libcrypto.so.1.1 => /usr/lib64/libcrypto.so.1.1 (0x00007f1a22efa000)
libgssapi_krb5.so.2 => /usr/lib64/libgssapi_krb5.so.2 (0x00007f1a22ca5000)
libkrb5.so.3 => /usr/lib64/libkrb5.so.3 (0x00007f1a229bb000)
libk5crypto.so.3 => /usr/lib64/libk5crypto.so.3 (0x00007f1a227a4000)
libcom_err.so.2 => /usr/lib64/libcom_err.so.2 (0x00007f1a225a0000)
libldap-2.4.so.2 => /usr/lib64/libldap-2.4.so.2 (0x00007f1a22351000)
liblber-2.4.so.2 => /usr/lib64/liblber-2.4.so.2 (0x00007f1a22141000)
libbrotlidec.so.1 => /usr/lib64/libbrotlidec.so.1 (0x00007f1a21f34000)
libz.so.1 => /usr/lib64/libz.so.1 (0x00007f1a21d1c000)
libunistring.so.2 => /usr/lib64/libunistring.so.2 (0x00007f1a2199b000)
librt.so.1 => /usr/lib64/librt.so.1 (0x00007f1a21793000)
libkrb5support.so.0 => /usr/lib64/libkrb5support.so.0 (0x00007f1a21582000)
libkeyutils.so.1 => /usr/lib64/libkeyutils.so.1 (0x00007f1a2137e000)
libresolv.so.2 => /usr/lib64/libresolv.so.2 (0x00007f1a21167000)
libsasl2.so.3 => /usr/lib64/libsasl2.so.3 (0x00007f1a20f49000)
libbrotlicommon.so.1 => /usr/lib64/libbrotlicommon.so.1 (0x00007f1a20d28000)
libselinux.so.1 => /usr/lib64/libselinux.so.1 (0x00007f1a20afe000)
libcrypt.so.1 => /usr/lib64/libcrypt.so.1 (0x00007f1a208d5000)
libpcre2-8.so.0 => /usr/lib64/libpcre2-8.so.0 (0x00007f1a20651000)
ldd client output:
[root@zis26 secret_prov_pf]# ldd client
linux-vdso.so.1 (0x00007fff347d3000)
libsgx_util.so => /usr/lib64/libsgx_util.so (0x00007f83c266f000)
libc.so.6 => /usr/lib64/libc.so.6 (0x00007f83c22aa000)
libcjson.so.1 => /usr/lib64/libcjson.so.1 (0x00007f83c20a2000)
libcurl.so.4 => /usr/lib64/libcurl.so.4 (0x00007f83c1e14000)
/lib64/ld-linux-x86-64.so.2 (0x00007f83c2afc000)
libm.so.6 => /usr/lib64/libm.so.6 (0x00007f83c1a92000)
libnghttp2.so.14 => /usr/lib64/libnghttp2.so.14 (0x00007f83c186b000)
libidn2.so.0 => /usr/lib64/libidn2.so.0 (0x00007f83c164d000)
libssh.so.4 => /usr/lib64/libssh.so.4 (0x00007f83c13de000)
libpsl.so.5 => /usr/lib64/libpsl.so.5 (0x00007f83c11cd000)
libssl.so.1.1 => /usr/lib64/libssl.so.1.1 (0x00007f83c0f39000)
libcrypto.so.1.1 => /usr/lib64/libcrypto.so.1.1 (0x00007f83c0a50000)
libgssapi_krb5.so.2 => /usr/lib64/libgssapi_krb5.so.2 (0x00007f83c07fb000)
libkrb5.so.3 => /usr/lib64/libkrb5.so.3 (0x00007f83c0511000)
libk5crypto.so.3 => /usr/lib64/libk5crypto.so.3 (0x00007f83c02fa000)
libcom_err.so.2 => /usr/lib64/libcom_err.so.2 (0x00007f83c00f6000)
libldap-2.4.so.2 => /usr/lib64/libldap-2.4.so.2 (0x00007f83bfea7000)
liblber-2.4.so.2 => /usr/lib64/liblber-2.4.so.2 (0x00007f83bfc97000)
libbrotlidec.so.1 => /usr/lib64/libbrotlidec.so.1 (0x00007f83bfa8a000)
libz.so.1 => /usr/lib64/libz.so.1 (0x00007f83bf872000)
libpthread.so.0 => /usr/lib64/libpthread.so.0 (0x00007f83bf652000)
libunistring.so.2 => /usr/lib64/libunistring.so.2 (0x00007f83bf2d1000)
librt.so.1 => /usr/lib64/librt.so.1 (0x00007f83bf0c9000)
libdl.so.2 => /usr/lib64/libdl.so.2 (0x00007f83beec5000)
libkrb5support.so.0 => /usr/lib64/libkrb5support.so.0 (0x00007f83becb4000)
libkeyutils.so.1 => /usr/lib64/libkeyutils.so.1 (0x00007f83beab0000)
libresolv.so.2 => /usr/lib64/libresolv.so.2 (0x00007f83be899000)
libsasl2.so.3 => /usr/lib64/libsasl2.so.3 (0x00007f83be67b000)
libbrotlicommon.so.1 => /usr/lib64/libbrotlicommon.so.1 (0x00007f83be45a000)
libselinux.so.1 => /usr/lib64/libselinux.so.1 (0x00007f83be230000)
libcrypt.so.1 => /usr/lib64/libcrypt.so.1 (0x00007f83be007000)
libpcre2-8.so.0 => /usr/lib64/libpcre2-8.so.0 (0x00007f83bdd83000)
Hm, nothing special that I can see.
Could you also show the result of ldd /usr/lib64/libsecret_prov_verify_dcap.so?
@dimakuv Here is the output:
[root@zis26 ra-tls-secret-prov]# ldd /usr/lib64/libsecret_prov_verify_dcap.so
linux-vdso.so.1 (0x00007ffd507b2000)
libsgx_util.so => /usr/local/lib64/libsgx_util.so (0x00007fbe66cb1000)
libsgx_dcap_quoteverify.so.1 => /lib64/libsgx_dcap_quoteverify.so.1 (0x00007fbe66a8c000)
libpthread.so.0 => /lib64/libpthread.so.0 (0x00007fbe6686c000)
libc.so.6 => /lib64/libc.so.6 (0x00007fbe664a7000)
libcurl.so.4 => /lib64/libcurl.so.4 (0x00007fbe66219000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007fbe66015000)
libstdc++.so.6 => /lib64/libstdc++.so.6 (0x00007fbe65c80000)
libm.so.6 => /lib64/libm.so.6 (0x00007fbe658fe000)
libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007fbe656e6000)
/lib64/ld-linux-x86-64.so.2 (0x00007fbe6717c000)
libnghttp2.so.14 => /lib64/libnghttp2.so.14 (0x00007fbe654bf000)
libidn2.so.0 => /lib64/libidn2.so.0 (0x00007fbe652a1000)
libssh.so.4 => /lib64/libssh.so.4 (0x00007fbe65032000)
libpsl.so.5 => /lib64/libpsl.so.5 (0x00007fbe64e21000)
libssl.so.1.1 => /lib64/libssl.so.1.1 (0x00007fbe64b8d000)
libcrypto.so.1.1 => /lib64/libcrypto.so.1.1 (0x00007fbe646a4000)
libgssapi_krb5.so.2 => /lib64/libgssapi_krb5.so.2 (0x00007fbe6444f000)
libkrb5.so.3 => /lib64/libkrb5.so.3 (0x00007fbe64165000)
libk5crypto.so.3 => /lib64/libk5crypto.so.3 (0x00007fbe63f4e000)
libcom_err.so.2 => /lib64/libcom_err.so.2 (0x00007fbe63d4a000)
libldap-2.4.so.2 => /lib64/libldap-2.4.so.2 (0x00007fbe63afb000)
liblber-2.4.so.2 => /lib64/liblber-2.4.so.2 (0x00007fbe638eb000)
libbrotlidec.so.1 => /lib64/libbrotlidec.so.1 (0x00007fbe636de000)
libz.so.1 => /lib64/libz.so.1 (0x00007fbe634c6000)
libunistring.so.2 => /lib64/libunistring.so.2 (0x00007fbe63145000)
librt.so.1 => /lib64/librt.so.1 (0x00007fbe62f3d000)
libkrb5support.so.0 => /lib64/libkrb5support.so.0 (0x00007fbe62d2c000)
libkeyutils.so.1 => /lib64/libkeyutils.so.1 (0x00007fbe62b28000)
libresolv.so.2 => /lib64/libresolv.so.2 (0x00007fbe62911000)
libsasl2.so.3 => /lib64/libsasl2.so.3 (0x00007fbe626f3000)
libbrotlicommon.so.1 => /lib64/libbrotlicommon.so.1 (0x00007fbe624d2000)
libselinux.so.1 => /lib64/libselinux.so.1 (0x00007fbe622a8000)
libcrypt.so.1 => /lib64/libcrypt.so.1 (0x00007fbe6207f000)
libpcre2-8.so.0 => /lib64/libpcre2-8.so.0 (0x00007fbe61dfb000)
Ok, and please the last one: ldd /usr/local/lib64/libsgx_util.so
Sure, here is the output:
[root@zis26 ~]# ldd /usr/local/lib64/libsgx_util.so
linux-vdso.so.1 (0x00007ffcea9fa000)
libcurl.so.4 => /lib64/libcurl.so.4 (0x00007f7651c2b000)
libc.so.6 => /lib64/libc.so.6 (0x00007f7651866000)
libnghttp2.so.14 => /lib64/libnghttp2.so.14 (0x00007f765163f000)
libidn2.so.0 => /lib64/libidn2.so.0 (0x00007f7651421000)
libssh.so.4 => /lib64/libssh.so.4 (0x00007f76511b2000)
libpsl.so.5 => /lib64/libpsl.so.5 (0x00007f7650fa1000)
libssl.so.1.1 => /lib64/libssl.so.1.1 (0x00007f7650d0d000)
libcrypto.so.1.1 => /lib64/libcrypto.so.1.1 (0x00007f7650824000)
libgssapi_krb5.so.2 => /lib64/libgssapi_krb5.so.2 (0x00007f76505cf000)
libkrb5.so.3 => /lib64/libkrb5.so.3 (0x00007f76502e5000)
libk5crypto.so.3 => /lib64/libk5crypto.so.3 (0x00007f76500ce000)
libcom_err.so.2 => /lib64/libcom_err.so.2 (0x00007f764feca000)
libldap-2.4.so.2 => /lib64/libldap-2.4.so.2 (0x00007f764fc7b000)
liblber-2.4.so.2 => /lib64/liblber-2.4.so.2 (0x00007f764fa6b000)
libbrotlidec.so.1 => /lib64/libbrotlidec.so.1 (0x00007f764f85e000)
libz.so.1 => /lib64/libz.so.1 (0x00007f764f646000)
libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f764f426000)
/lib64/ld-linux-x86-64.so.2 (0x00007f765214a000)
libunistring.so.2 => /lib64/libunistring.so.2 (0x00007f764f0a5000)
librt.so.1 => /lib64/librt.so.1 (0x00007f764ee9d000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007f764ec99000)
libkrb5support.so.0 => /lib64/libkrb5support.so.0 (0x00007f764ea88000)
libkeyutils.so.1 => /lib64/libkeyutils.so.1 (0x00007f764e884000)
libresolv.so.2 => /lib64/libresolv.so.2 (0x00007f764e66d000)
libsasl2.so.3 => /lib64/libsasl2.so.3 (0x00007f764e44f000)
libm.so.6 => /lib64/libm.so.6 (0x00007f764e0cd000)
libbrotlicommon.so.1 => /lib64/libbrotlicommon.so.1 (0x00007f764deac000)
libselinux.so.1 => /lib64/libselinux.so.1 (0x00007f764dc82000)
libcrypt.so.1 => /lib64/libcrypt.so.1 (0x00007f764da59000)
libpcre2-8.so.0 => /lib64/libpcre2-8.so.0 (0x00007f764d7d5000)
Unfortunately, I don't see anything special. The only way forward I see is to debug with GDB and get to the bottom of the -15104 error.
"Hm, this error code is: mbedtls_ssl_handshake() failed: -0x3b00 (-15104): PK - The pubkey tag or value is invalid (only RSA and EC are supported)"
@dimakuv Is this error example-specific, or does it depend on the DCAP configuration?
@tanishq-akula I feel like this error is example-specific. It doesn't seem to have anything to do with the DCAP infrastructure/config.
@tanishq-akula: Does this problem still occur on the current RHEL? Is there anything we should change/fix in Gramine, or can I close this issue?
Hi @mkow, I haven't had the chance to explore this further. I appreciate your assistance and support. It’s okay to close this issue for now.
Description of the problem
Hi,
While running the command gramine-sgx ./client, the following error was encountered.
System specs: RHEL 8.6, kernel 4.18.0-348.20.1.el8_5.x86_64
Thanks
Steps to reproduce
No response
Expected results
No response
Actual results
No response
Gramine commit hash
e18bc05b17fd704b259cb0401f928dc4ec5199a6