marl / crepe

CREPE: A Convolutional REpresentation for Pitch Estimation -- pre-trained model (ICASSP 2018)
https://marl.github.io/crepe/
MIT License

CREPE model Tensorflow on Android #79

Open remyut opened 2 years ago

remyut commented 2 years ago

Hi,

I use the CREPE model in a web browser, where it works pretty well, but is there a way to integrate it into an Android or Flutter web/mobile app using the TensorFlow library?

The model works just fine when the mic is used from the web browser (with ml5.js, for example). I wish to connect another source, like the phone mic; is this possible?

Could you please help me to understand?

I believe it should be possible. I do not want to re-train the model, since it already works fine. I just need to understand how to modify the input audio data from another source to fit the model.
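For reference, the reference implementation feeds the model 16 kHz mono audio sliced into 1024-sample frames (10 ms hop by default), with each frame normalized to zero mean and unit variance. A minimal sketch of that preprocessing, which any other audio source (e.g. a phone mic buffer) would need to replicate:

```python
import numpy as np

SR = 16000    # CREPE models are trained on 16 kHz audio
FRAME = 1024  # each model input is a 1024-sample frame
HOP = 160     # 10 ms hop, the default in the reference implementation

def frames_for_crepe(audio_16k: np.ndarray) -> np.ndarray:
    """Slice a mono 16 kHz signal into normalized 1024-sample frames."""
    n = 1 + max(0, len(audio_16k) - FRAME) // HOP
    out = np.stack([audio_16k[i * HOP : i * HOP + FRAME] for i in range(n)])
    out = out.astype(np.float32)
    out -= out.mean(axis=1, keepdims=True)  # zero mean per frame
    out /= np.clip(out.std(axis=1, keepdims=True), 1e-8, None)  # unit variance
    return out
```

If the source delivers audio at a different sample rate (Android mics often default to 44.1 or 48 kHz), it has to be resampled to 16 kHz first.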

Thanks for your help Remy

martingasser commented 2 years ago

I'm using it right now in an iOS/Android React Native app using TFLite.

You can read up on how to convert the Keras model to TFLite here: https://www.tensorflow.org/lite/convert

I just had to write a small conversion script that builds the Keras model and loads the weights from the h5 files provided in the repo, and then I used the tf.lite.TFLiteConverter.from_keras_model(...) API.
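A minimal sketch of such a conversion script. The stand-in Keras model here is hypothetical so the snippet runs on its own; in practice you would build the actual CREPE architecture and call `model.load_weights(...)` with one of the h5 files from the repo before converting:

```python
import tensorflow as tf

# Hypothetical stand-in for the CREPE Keras model. In practice, build the
# real architecture and load weights, e.g. model.load_weights("model-tiny.h5").
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(1024,)),
    tf.keras.layers.Dense(360, activation="sigmoid"),  # 360 pitch bins
])

# Convert the in-memory Keras model to a TFLite flatbuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()

with open("crepe.tflite", "wb") as f:
    f.write(tflite_bytes)
```

The resulting `.tflite` file can then be bundled with the React Native (or native Android/iOS) app and loaded with the platform's TFLite runtime.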

remyut commented 2 years ago

That's great, I'll check it out. Thanks for the help!

Remy


RemyNtshaykolo commented 2 years ago

> I'm using it right now in an iOS/Android React Native app using TFLite.
>
> You can read up on how to convert the Keras model to TFLite here: https://www.tensorflow.org/lite/convert
>
> I just had to write a small conversion script that builds the Keras model and loads the weights from the h5 files provided in the repo, and then I used the tf.lite.TFLiteConverter.from_keras_model(...) API.

Hey, were you able to load a model bigger than the tiny one? The bigger the model, the more latency I have during inference.

martingasser commented 2 years ago

> Hey, were you able to load a model bigger than the tiny one? The bigger the model, the more latency I have during inference.

Of course, bigger models need more computation. On a low-end Android device (Samsung Galaxy A12), I can only run the tiny model in real time. On an iPhone 11, I can use the "small" model without problems.
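A quick way to compare capacities on a given device is to time a single-frame inference with the TFLite interpreter. This sketch builds a hypothetical stand-in model so it is self-contained; substitute your converted CREPE `.tflite` file (loaded via `model_path=...`) to measure the real thing:

```python
import time
import numpy as np
import tensorflow as tf

# Hypothetical stand-in model so the snippet is self-contained; in practice,
# pass model_path="your-converted-crepe.tflite" to tf.lite.Interpreter instead.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(1024,)),
    tf.keras.layers.Dense(360, activation="sigmoid"),
])
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()

interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# One normalized 1024-sample audio frame (zeros here, just for timing).
frame = np.zeros((1, 1024), dtype=np.float32)
start = time.perf_counter()
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()
salience = interpreter.get_tensor(out["index"])  # per-bin pitch salience
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"latency: {elapsed_ms:.2f} ms, output shape: {salience.shape}")
```

For real-time use, the per-frame latency has to stay below the hop interval (10 ms at the default hop), which is why the larger capacities fall behind on low-end hardware.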

Which device are you using?