Narasimha1997 / smartreply

Unofficial port of Google's Smart Reply runtime (which powers Gmail and Assistant) to Python, allowing developers to use intelligent smart reply as an API in web and embedded systems that support Linux, a loader (ld.so), a fully POSIX C++ runtime, and a Python interpreter.
MIT License
56 stars 9 forks

Great work! Few queries regarding the model #4

Open chiranshu14 opened 3 years ago

chiranshu14 commented 3 years ago

Hello Narasimha,

You have done an awesome job of creating this Python port. I had been trying to find ways to run the model on desktop for two days with no luck. This worked perfectly.

This is not really an issue but a few queries -

  1. The model seems to work only "fine", not as well as it does on Android. Do you think they use a much more advanced version? If yes, how can we improve this?

  2. Their model sends some fallback responses if it can't suggest anything ("Ok", "yes", "no", "(y)", ":)", etc.). Have you added any condition for that? Your code doesn't reply with anything when it can't find a suggestion.

  3. Do you know what the numbers mean? They definitely do not look like probabilities, since they go above 1. What would be a good condition for ignoring a suggestion? (Maybe ignore all suggestions with a score below 0.5? What do you think?) [screenshot of model output]

  4. Since this model was published in 2019, it is a bit outdated. Do you know of any more recent research/model that is readily available for use?

Appreciate your help! Thanks again for your great work.

Narasimha1997 commented 3 years ago

@chiranshu14 Hi, thanks for opening the issue. And thanks for appreciating my small work.

  1. The model is not exactly the same as what Google uses in their production system. This is a port of the open-source version they released, so it's not as accurate.
  2. You're right, I haven't added that feature, but it can be added. Thanks for suggesting it. If you'd like to contribute that feature, I'd be happy to merge it.
  3. Maybe you can ignore suggestions with a probability of less than 0.5, or set your own threshold based on the model's performance. Since the model is not very good at predicting, a lower threshold can make sure you still get reasonably accurate results.
  4. NLP has evolved a lot, and of course there are more accurate language models these days, but make sure they are lightweight as well. One suggestion would be to use BERT-lite and derive the nearest responses from word embeddings. If I find any good alternatives, I'll update this comment for sure. Thanks!
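The thresholding idea in point 3 could be sketched like this: since the raw scores are unbounded (they can exceed 1), one option is to squash them through a softmax first and then threshold the resulting probabilities. Note the `(reply, score)` pair format below is an assumption for illustration, not necessarily the runtime's actual output shape.

```python
import math

def filter_suggestions(suggestions, threshold=0.5):
    """Normalize raw smart-reply scores with a softmax and drop
    low-confidence suggestions.

    `suggestions` is assumed to be a list of (reply_text, raw_score)
    pairs -- the exact output shape of the runtime may differ.
    """
    if not suggestions:
        return []
    # Softmax turns unbounded raw scores into probabilities summing to 1.
    # Subtracting the max score first keeps exp() numerically stable.
    max_score = max(score for _, score in suggestions)
    exps = [math.exp(score - max_score) for _, score in suggestions]
    total = sum(exps)
    probs = [e / total for e in exps]
    return [
        (text, p)
        for (text, _), p in zip(suggestions, probs)
        if p >= threshold
    ]

# Hypothetical raw output: scores above 1 are fine pre-softmax.
raw = [("Sounds good!", 1.8), ("Ok", 0.9), ("No", 0.1)]
print(filter_suggestions(raw, threshold=0.5))
```

With a threshold of 0.5 at most one suggestion can survive (the probabilities sum to 1), so in practice you would likely tune the threshold lower, as suggested above.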

I'm thinking of redesigning this project soon to make it easier to install and work with. I'm also planning to support multiple Python versions, with proper CI/CD pipelines and unit tests. Two years back, I was in a hurry to finish the project, so I didn't bother much about clean design; basically, I wasn't very experienced at the time. Now I have a clear idea of what to do.

chiranshu14 commented 3 years ago

Sounds good. It's unfortunate that we can't just retrain Google's existing model with more data to improve it. Your suggestion about BERT sounds good to me; I'll also check whether there is a better option to implement this. I will update here when I have something solid to share.

Thanks!

chiranshu14 commented 3 years ago

@Narasimha1997 Can you please tell me how you got the header files for the custom ops used in the tflite model? Are they from the TensorFlow git repo, or did you build them yourself? Thanks!

Narasimha1997 commented 3 years ago

Hi @chiranshu14, the custom ops are taken from the TensorFlow Lite smart reply example for Android.

You can have a look at it here: https://github.com/tensorflow/examples/tree/master/lite/examples/smart_reply/android/app

chiranshu14 commented 3 years ago

Thanks! I'm currently trying to use the tflite model from the ML Kit website. I tried their Android sample, and it is much better than the one from the TensorFlow site. Surprisingly, they have not kept the header files in the repo for that project, so I may end up at a dead end. Will keep you posted!

Narasimha1997 commented 3 years ago

@chiranshu14 Hi, I have created a new repository with a much cleaner code base and a CMake-powered build system with automated releases. You can check out the repository here:

py-smartreply

Looking forward to your review and contributions.

chiranshu14 commented 3 years ago

Great, thanks! I'll have a look.

For my application, I have decided to go with the DialoGPT model. It gives much better and smarter responses than Google's model; check it out if you're interested. The only caveat is that it's a huge model and might be slow at generating inferences. I'm currently trying to distill the model to speed things up.

Q: Which tflite file have you used for this project? The one from the TensorFlow site is a dummy model and isn't good. Another model can be found in the Firebase Android samples; that model gives very high-quality responses (exactly like Gmail/Gboard, etc.). If you weren't able to find the file, let me know and I'll share it with you.

Narasimha1997 commented 3 years ago

@chiranshu14

That would be great. If you want to open-source that as a Python package, let me know; I'd be happy to contribute.

I had a brief look at DialoGPT. I think it's too huge; it won't be suitable for embedded systems with minimal memory and compute capability.

chiranshu14 commented 3 years ago

@Narasimha1997

Sure, distillation is going to take some time. In the meantime, I'm also waiting for GPT-3 to be released, which produces remarkably high-quality output.

Quick question, in case you missed it above: which tflite file have you used in your repo? The one from the TensorFlow site is a dummy model and isn't good, while the model from the Firebase Android samples gives very high-quality responses (exactly like Gmail/Gboard, etc.). If you weren't able to find the file, let me know and I'll share it with you.

Narasimha1997 commented 3 years ago

@chiranshu14 Sure, you can share the file. I'll check how it can be deployed.

chiranshu14 commented 3 years ago

@Narasimha1997 True, it's not suitable for direct deployment on edge devices. But I have some hope for the distilled model: using the existing model as a teacher, we can train a smaller student model, such as an LSTM, via teacher-student knowledge distillation. I'm currently working on that and will keep you posted.
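For anyone following along, the core of the teacher-student approach mentioned above is a loss that pushes the student's output distribution toward the teacher's softened distribution. This is only a minimal sketch of that loss (in practice you would compute it over batches in an ML framework and usually mix it with the hard-label loss); the function names and the temperature value are illustrative, not from either repo.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; a higher temperature yields
    softer (more uniform) target distributions."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy between the softened teacher and student
    distributions. Minimized when the student matches the teacher."""
    teacher_probs = softmax(teacher_logits, temperature)
    student_probs = softmax(student_logits, temperature)
    return -sum(
        t * math.log(s + 1e-12)  # epsilon guards against log(0)
        for t, s in zip(teacher_probs, student_probs)
    )
```

A typical training objective would then be something like `alpha * distillation_loss(...) + (1 - alpha) * hard_label_loss`, with `alpha` tuned on a validation set.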

Here is the Firebase/ML Kit tflite file for Smart Reply. A quick comparison in Netron shows how much more complex this model is than the TensorFlow version.

Narasimha1997 commented 3 years ago

@chiranshu14 Thanks, I saw that. I guess the model does not depend on any custom plugins the way the earlier tflite model does. I will try running it with Python first.