Closed thempp66 closed 3 days ago
First, something inside Python does open a disallowed-in-manifest file:

```
(pal_files.c:108:file_open) warning: Disallowing access to file '0fb3c695d8b04d5c9f236bd03447d134.onnx'; file is not trusted or allowed.
(libos_parser.c:1658:buf_write_all) [P1:T1:python3.8] trace: ---- openat(AT_FDCWD, "0fb3c695d8b04d5c9f236bd03447d134.onnx", O_WRONLY|O_CREAT|O_TRUNC|0x80000, 0666) = -13
```

Second, we need more info:

- How exactly do you run that Python script?
- Can you attach the whole Gramine log?
- Do you know what these `.onnx` files are?

> How exactly do you run that Python script?

1. `git clone` the Gramine repo and change to v1.7.
2. Create a file named `test-ml.py` under `gramine/CI-Examples/python/scripts/test-ml.py` (the code is shown above).
3. `cd gramine/CI-Examples/python/` and `make SGX=1`.
4. `gramine-sgx ./python scripts/test-ml.py`

> Can you attach the whole Gramine log?

Sure: log.txt

> Do you know what these `.onnx` files are?

ONNX is a model file format used in machine learning. I am not sure, but I guess it probably generates a temporary `.onnx` file and saves it somewhere when running line 15, `model.fit(X_train, y_train)`. The file may then be read by the following code.
Thanks @thempp66 for good pointers!

Ok, looking at the source code, we observe this: https://github.com/microsoft/hummingbird/blob/d489151e97eaa9d8ec446118b709863a5acab87d/hummingbird/ml/_topology.py#L248-L255

This means that this `.onnx` file name is generated pseudo-randomly by default:

```python
if output_model_name is None:
    output_model_name = str(uuid4().hex) + ".onnx"
```

Now we just need to specify a concrete name; this is achieved by something called "extra config":

```python
if constants.ONNX_OUTPUT_MODEL_NAME in extra_config:
    onnx_model_name = extra_config[constants.ONNX_OUTPUT_MODEL_NAME]
    output_model_name = onnx_model_name + ".onnx"
```

At this point I don't know what this `ONNX_OUTPUT_MODEL_NAME` config is, and how to specify it. I only see that it's somehow possible, based on this: https://github.com/microsoft/hummingbird/blob/d489151e97eaa9d8ec446118b709863a5acab87d/hummingbird/ml/supported.py#L532-L533

So please figure out how to specify a concrete name here. Then you won't have that pseudo-random `3bdebf8421c9448fbdb2245a965f847a.onnx` stuff.

And finally, you'll be able to specify something like this in your ML script:

```python
# I actually don't know how it is done, please check the Hummingbird docs
ONNX_OUTPUT_MODEL_NAME = "/some/path/with/models/mymodel.onnx"
```

Now you can instruct Gramine to allow this file. Add something like this to your Gramine manifest file:

```toml
fs.mounts = [
  ...
  { path = "/some/path/with/models/", uri = "file:/some/path/with/models/" },
]

sgx.allowed_files = [
  ...
  "file:/some/path/with/models/",
]
```
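The naming behavior quoted above can be sketched in plain Python. Note this is a minimal standalone sketch, not Hummingbird's actual code path: the `"onnx_model_name"` key string is an assumption (Hummingbird refers to it via `constants.ONNX_OUTPUT_MODEL_NAME`), and `pick_output_name` is a hypothetical helper:

```python
import re
from uuid import uuid4

def pick_output_name(extra_config):
    # Use a fixed name when one is supplied in extra_config, otherwise
    # fall back to a pseudo-random 32-character hex UUID, mirroring the
    # two code snippets quoted from _topology.py above.
    name = extra_config.get("onnx_model_name")
    if name is not None:
        return name + ".onnx"
    return str(uuid4().hex) + ".onnx"

random_name = pick_output_name({})
fixed_name = pick_output_name({"onnx_model_name": "mymodel"})
print(random_name)  # e.g. 3bdebf8421c9448fbdb2245a965f847a.onnx
print(fixed_name)   # mymodel.onnx
```

Because the default name is a fresh UUID on every run, it can never be whitelisted ahead of time in a Gramine manifest; pinning the name via extra config is what makes the manifest entry possible.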
Thank you so much! I will try this.
Thanks, @dimakuv. It's working now.

Update `/usr/local/lib/python3.8/dist-packages/concrete/ml/sklearn/base.py:1681` and set

```python
extra_config={"onnx_target_opset": OPSET_VERSION_FOR_ONNX_EXPORT, "onnx_model_name": "test"},
```

And then add the allowed files to the manifest:

```toml
sgx.allowed_files = [
  "file:test.onnx",
  "file:.artifacts/environment.txt",
  "file:.artifacts/requirements.txt",
  ...
]
```

By the way, there are so many tmp files generated when running. So I was wondering if we could use some regular expressions in `sgx.allowed_files`?
Well, we (or at least I) were thinking about this at some point, but we never had a pressing need to implement regular expressions in the Gramine manifest.

Typically these tmp files are generated under some directory. So you can specify the whole directory in `sgx.allowed_files`, and then you don't care about the particular names of the files.
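For example, if the tmp files all land under one directory (the `/tmp/ml-cache/` path below is purely illustrative), allowing the whole directory would look like this:

```toml
sgx.allowed_files = [
  ...
  "file:/tmp/ml-cache/",
]
```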
Even better: for tmp files you don't use `sgx.allowed_files` at all, but create these files inside the SGX enclave itself. Gramine supports this using the `tmpfs` FS mount type: https://gramine.readthedocs.io/en/stable/manifest-syntax.html#fs-mount-points. One example can be found here: https://github.com/gramineproject/gramine/blob/ac61ae1777986e88629b764a1d6a8d6884ea869e/CI-Examples/python/python.manifest.template#L31
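Following the manifest syntax linked above, a minimal `tmpfs` mount sketch (the `/tmp` mount path is chosen here for illustration) would be:

```toml
fs.mounts = [
  ...
  # In-enclave, memory-backed file system: files written under /tmp
  # never reach the host, so they need no sgx.allowed_files entries.
  { type = "tmpfs", path = "/tmp" },
]
```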
OK, thanks again, I will close this issue.
Description of the problem
Permission denied when opening a file. But it seems that my code does not open any file. Could you give me some tips to solve this problem?
Steps to reproduce
1. Download and use the Gramine v1.7 Docker image
2. Install pip
3. Install the library: `pip install -U pip wheel setuptools && pip install concrete-ml`
4. Make and run the demo
Expected results
Actual results
Gramine commit hash
10e93534169802be16fc9e2b3e9ac70d08efcb41