Closed csdf-ssm closed 2 years ago
@csdf-ssm , We see that you are using TF version 1.13. TF 1.x is no longer actively supported; please update to the latest stable TF v2.7 and let us know if you are still facing the same issue.
@tilakrayal I downloaded tensorflow.git and used the demo in the /tensorflow/tensorflow/lite/examples/ios/camera directory. After running `pod install`, the Podfile.lock contains:

```
PODS:
  - TensorFlowLite (1.13.1)

DEPENDENCIES:
  - TensorFlowLite (= 1.13.1)

SPEC REPOS:
  trunk:
    - TensorFlowLite

SPEC CHECKSUMS:
  TensorFlowLite: 8b9dc4eb32eac0f8cb660c66bca7604da56dcc5a

PODFILE CHECKSUM: 1f44a2ab814725d3675508d4a66a5efdb4140d58

COCOAPODS: 1.11.2
```

Then I checked the pod's versions, and the latest published version is 1.13.1:

```
-> TensorFlowLite (1.13.1)
   TensorFlow Lite
   pod 'TensorFlowLite', '~> 1.13.1'
```
```
Homepage: https://www.tensorflow.org/lite/
Source:   https://dl.google.com/dl/cpdc/a3cc8a8fb2aec8f6/TensorFlowLite-1.13.1.tar.gz
Versions: 1.13.1, 1.12.0, 1.11.0, 1.10.1, 1.10.0, 1.9.0, 0.1.7, 0.0.3, 0.0.2 [master repo]
```
@tilakrayal I also downloaded the TensorFlow Lite Swift demo, ImageClassification, and it runs successfully, but I do not want to load a model.tflite file path to create the Model. You know what I mean? The code:
```swift
interpreter = try Interpreter.init(modelPath: modelPath, options: options, delegates: nil)
```

and in the `Interpreter` class:

```swift
public init(modelPath: String, options: Options? = nil, delegates: [Delegate]? = nil) throws {
  guard let model = Model(filePath: modelPath) else {
    throw InterpreterError.failedToLoadModel
  }
  guard let cInterpreterOptions = TfLiteInterpreterOptionsCreate() else {
    throw InterpreterError.failedToCreateInterpreter
  }
  ...
```

and in the `Model` class:

```swift
/// - filePath: The local file path to a TensorFlow Lite model.
init?(filePath: String) {
  guard !filePath.isEmpty, let cModel = TfLiteModelCreateFromFile(filePath) else { return nil }
  self.cModel = cModel
}
```

so everything goes through this guard:

```swift
guard !filePath.isEmpty, let cModel = TfLiteModelCreateFromFile(filePath) else { return nil }
```
I want to use this method instead:

```c
// Returns a model from the provided buffer, or null on failure.
TFL_CAPI_EXPORT extern TfLiteModel* TfLiteModelCreate(const void* model_data,
                                                      size_t model_size);
```
but it fails at runtime, and I do not know where I made a mistake. In my demo:

```swift
let data = try! Data.init(contentsOf: URL.init(fileURLWithPath: modelPath))
var bytes = [UInt8](data)
let unsafeMutableRawBufferPointer = bytes.withUnsafeMutableBytes { $0 }
let unsafeMutablePointer = unsafeMutableRawBufferPointer.baseAddress
```
This is my question. Sorry the code is a mess, but do you know what I mean? If you can, could you show me a demo that loads a model with `TfLiteModelCreate(const void* model_data, size_t model_size)`? Thanks!
@csdf-ssm , As commented, TF v1.x is not actively supported; please update to 2.x and let us know if you are still facing the same issue. Thanks.
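If it helps anyone landing here: the 2.x releases are published under different pod names (the legacy `TensorFlowLite` pod stops at 1.13.1, as the `Versions:` list above shows). A minimal Podfile sketch; the target name `MyApp` is hypothetical and the version number is just an example:

```ruby
# 2.x Swift API; replaces the legacy 'TensorFlowLite' pod (which ends at 1.13.1)
target 'MyApp' do
  use_frameworks!
  pod 'TensorFlowLiteSwift', '~> 2.7.0'  # or 'TensorFlowLiteObjC' for Objective-C
end
```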
This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.
Closing as stale. Please reopen if you'd like to work on this further.
Error tip: Model provided has model identifier 'AFRG', should be 'TFL3'

I have model.tflite, and I get this error when I create the model:

```cpp
model = tflite::FlatBufferModel::BuildFromBuffer(newmodel_data, modellength);
```

What can I do?