hoangcuongnguyen2001 closed this issue 6 months ago.
You'll need to drop `pytorch_model.bin` and `config.json` into the directory `/tram/data/ml-models/bert_model`. You can look at the Dockerfile to see how we do this for the official TRAM Docker image. Then restart the Docker container.
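A minimal sketch of that step, assuming your fine-tuned files were saved to a local directory named `my-finetuned-scibert` (that source path and the restart command are assumptions, not from the TRAM docs; the target path is the one given above):

```shell
# Hypothetical local directory holding your fine-tuned model files.
MODEL_DIR=${MODEL_DIR:-./my-finetuned-scibert}
# Directory TRAM loads the BERT model from (per the instructions above).
TARGET=${TARGET:-/tram/data/ml-models/bert_model}

mkdir -p "$TARGET"
for f in pytorch_model.bin config.json; do
    if [ -f "$MODEL_DIR/$f" ]; then
        cp "$MODEL_DIR/$f" "$TARGET/$f"
    else
        echo "note: $MODEL_DIR/$f not found, skipping" >&2
    fi
done

# Afterwards, restart the container so TRAM picks up the new files, e.g.:
# docker-compose restart
```

If the files live inside a running container rather than on a shared volume, `docker cp` into the container would be the equivalent step.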
Thank you! Just to be sure about your suggestion: do I need to create my own URLs for `pytorch_model.bin` and `config.json`, or do I just need to restart the Docker container as in the TRAM installation guide (`docker-compose up`)? I have already looked at your Dockerfile, and it contains URLs for both of these files, so I am not sure whether creating URLs for my fine-tuned SciBERT files is necessary.
Also, I have another question: where did your team get the dataset of CTI sentences used to train SciBERT? Was the dataset mined from the knowledge base of the MITRE ATT&CK framework?
@hoangcuongnguyen2001 if you are only trying to test the performance of the models you have trained, and you're not yet trying to use them as finished models, I would not recommend installing them in the tool itself. It is easier to evaluate their performance using a program similar to the last cells of the two fine-tuning notebooks.
That said, if you want to install the models in the UI tool, you can edit the Dockerfile to copy the model binary and its config file into the image from your local filesystem. The model binary needs to be copied to `/tram/data/ml-models/bert_model/pytorch_model.bin`, and the config file to `/tram/data/ml-models/bert_model/config.json`. You can then delete the lines that download our models into the image (which are these lines). For these changes to take effect, you will need to rebuild the image and then start Docker Compose again with the new image.
You may also need to delete lines 65 to 79, as these are for installing certificates that are specific to our environment.
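The Dockerfile edit described above might look like the following sketch (`my-finetuned-scibert/` is a hypothetical directory in your build context; the destination paths are the ones given above):

```dockerfile
# Copy your fine-tuned model into the image instead of downloading ours.
# "my-finetuned-scibert/" is an assumed path — adjust to your local layout.
COPY my-finetuned-scibert/pytorch_model.bin /tram/data/ml-models/bert_model/pytorch_model.bin
COPY my-finetuned-scibert/config.json /tram/data/ml-models/bert_model/config.json
```

After editing, rebuild the image and bring the stack up again, e.g. `docker-compose build` followed by `docker-compose up`.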
Closing due to inactivity.
Hi there,
Firstly, I would like to thank your team for making it possible for users like me to fine-tune the SciBERT model that TRAM uses with our own data. I have just finished doing that in my own Colab notebook.
However, I would now like to load my fine-tuned SciBERT model into TRAM to test its performance against new threat reports. Do you have any suggestions about how I can do this locally?