Open yogitavm opened 4 days ago
Hi @yogitavm! Does the issue occur with the provided example as well?
@MVYaroshenko, yes. This is the script I'm using:
#include <iostream>
#include <vector>
#include <string>
#include "GLiNER/gliner_config.hpp"
#include "GLiNER/processor.hpp"
#include "GLiNER/decoder.hpp"
#include "GLiNER/model.hpp"
#include "GLiNER/tokenizer_utils.hpp"
int main() {
    gliner::Config config{12, 512}; // Set your max_width and max_length
    gliner::WhitespaceTokenSplitter splitter;
    auto blob = gliner::LoadBytesFromFile("/path/to/onnx/gliner/model/tokenizer.json");

    // Create the tokenizer
    auto tokenizer = Tokenizer::FromBlobJSON(blob);

    // Create Processor and SpanDecoder
    gliner::SpanProcessor processor(config, *tokenizer, splitter);
    gliner::SpanDecoder decoder(config);

    // Create Model
    gliner::Model model("/path/to/onnx/gliner/model/model.onnx", config, processor, decoder);

    // A sample input
    std::vector<std::string> texts = {"Kyiv is the capital of Ukraine."};
    std::vector<std::string> entities = {"city", "country", "river", "person", "car"};

    auto output = model.inference(texts, entities);

    std::cout << "\nTest Model Inference:" << std::endl;
    for (size_t batch = 0; batch < output.size(); ++batch) {
        std::cout << "Batch " << batch << ":\n";
        for (const auto& span : output[batch]) {
            std::cout << "  Span: [" << span.startIdx << ", " << span.endIdx << "], "
                      << "Class: " << span.classLabel << ", "
                      << "Text: " << span.text << ", "
                      << "Prob: " << span.prob << std::endl;
        }
    }
    return 0;
}
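As an aside, since a wrong tokenizer path is a common cause of silent failures here: gliner::LoadBytesFromFile presumably just reads the whole tokenizer.json into memory. A minimal stand-alone sketch of an equivalent helper (load_bytes_from_file is a hypothetical stand-in, not the library's actual implementation) that throws on a bad path instead of handing the tokenizer an empty blob:

```cpp
#include <fstream>
#include <sstream>
#include <stdexcept>
#include <string>

// Hypothetical equivalent of gliner::LoadBytesFromFile: slurp the whole
// file (e.g. tokenizer.json) into a std::string, throwing on failure so a
// bad path fails loudly rather than producing an empty tokenizer blob.
std::string load_bytes_from_file(const std::string& path) {
    std::ifstream in(path, std::ios::binary);
    if (!in) {
        throw std::runtime_error("cannot open file: " + path);
    }
    std::ostringstream buf;
    buf << in.rdbuf();
    return buf.str();
}
```

Wrapping the load in a check like this makes it easy to confirm the paths you substituted into the example actually resolve.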
Are you building from inside the example folder?
Yes, I'm running the script from inside the example folder.
Try removing the build directories in both /GLiNER.cpp and /GLiNER.cpp/examples. Then, build only within /GLiNER.cpp/examples as follows:

    cmake -D ONNXRUNTIME_ROOTDIR="/home/usr/onnxruntime-linux-x64-1.19.2" -S . -B build
    cmake --build build --target inference
Hi,
I was trying to run the example inference.cpp with the paths updated to point at my ONNX GLiNER model, but I got this error:
How do I resolve this issue?
Thanks!