SitiNurAin opened this issue 9 months ago
At the moment, our server is offline, which impacts the app's ability to perform translations.
We don't plan on making the server live any time soon. However, you still have the opportunity to experience and test our sign language translation technology. To facilitate this, we recommend running the code locally on your machine.
We have a Python notebook that provides a straightforward way to test our models in real time. You can find the notebook at the following link: Real-time Testing Notebook
May I know how to put it in the app?
I'm not entirely sure I understand your question. Could you provide more context?
Are you inquiring about integrating the model into your application? If that's the case, could you share more specifics about your project?
Basically, I'm doing the same project as you guys, but the detection will be translated into my language, which is Malay.
So in this case, I want to know how you guys do the detection and put the sign language detection into Android Studio. :) hehe
I'm thrilled to hear about your interest in adapting it for Malay!
Our project, as you've seen, leverages deep learning models to detect and interpret sign language, and we've integrated this functionality into an Android app using Android Studio. Do keep in mind that we haven't implemented an on-device inferencing approach; rather, we went with having our models run in the cloud.
If you are interested, here's a broad overview of how we approached the sign language detection and integration:
What you need to do: to support the Malay language fully, you would not only translate the app's UI but also ensure that the sign language models are trained on signs specific to Malay sign language (BIM).
For a detailed guide on setting up the cloud-based architecture and integrating it with your Android app, you might want to explore the README file in our repository, where we outline the necessary steps. Additionally, familiarizing yourself with cloud services that support machine learning model hosting and Android's networking capabilities will be crucial.
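To make the networking side concrete, here is a minimal sketch in Python (using the `requests` library) of the request/response shape a client could use against a cloud-hosted prediction endpoint. The URL, JSON fields, and response schema below are illustrative assumptions rather than our exact API; in the Android app you would make the equivalent HTTP call with whatever networking library you prefer.

```python
# Hypothetical client-side sketch: send one camera frame to a cloud-hosted
# model endpoint and read back the predicted sign label.
# The URL, JSON fields, and response schema are illustrative assumptions.
import base64

import requests

SERVER_URL = "http://<your-vm-ip>:5000/predict"  # placeholder address


def predict_sign(frame_path: str) -> str:
    # Encode the frame as base64 so it can travel inside a JSON body.
    with open(frame_path, "rb") as f:
        frame_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = requests.post(SERVER_URL, json={"frame": frame_b64}, timeout=10)
    response.raise_for_status()

    # Assumed response shape: {"label": "...", "confidence": 0.97}
    return response.json()["label"]


if __name__ == "__main__":
    print(predict_sign("sample_frame.jpg"))
```

The important part is the contract between app and server (what goes in the request body and what comes back), since that is what your Android networking code will have to reproduce.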
If you have any questions or need assistance, feel free to reach out. We're excited to see your project come to life and the positive impact it will have on the Malay-speaking community!
Wow, so much information <3. Yeah, I do have more questions. May I know what software you guys use for this detection? And also the server?
For the server, we utilized Flask. Regarding "software for detection," note that we didn't employ a specific "software" for this, as the detection functionality is implemented within the codebase itself, and it's written in Python.
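If it helps, here is a minimal sketch of what a Flask endpoint wrapping a detection model could look like. The model loading, preprocessing, and label names are placeholder assumptions rather than our actual code, so treat it as a starting point and refer to the repository for the real implementation.

```python
# Minimal Flask sketch of a prediction endpoint. The model file, preprocessing,
# and label list below are placeholders, not the project's actual code.
import base64
import io

import numpy as np
from flask import Flask, jsonify, request
from PIL import Image

app = Flask(__name__)

# Placeholder: load your trained model here, e.g. with TensorFlow/Keras.
# model = tf.keras.models.load_model("sign_model.h5")
LABELS = ["hello", "thanks", "yes", "no"]  # illustrative labels only


@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"frame": "<base64-encoded image>"}.
    frame_b64 = request.get_json(force=True)["frame"]
    image = Image.open(io.BytesIO(base64.b64decode(frame_b64))).convert("RGB")

    # Placeholder preprocessing: resize and scale to the model's input shape.
    array = np.asarray(image.resize((224, 224)), dtype=np.float32) / 255.0
    array = np.expand_dims(array, axis=0)

    # Placeholder inference: replace with something like model.predict(array).
    scores = np.random.rand(len(LABELS))  # stand-in for real model output
    label = LABELS[int(np.argmax(scores))]

    return jsonify({"label": label, "confidence": float(np.max(scores))})


if __name__ == "__main__":
    # Bind to 0.0.0.0 so a phone on the same network (or the internet) can reach it.
    app.run(host="0.0.0.0", port=5000)
```

Sending frames as base64 inside JSON keeps the wire format simple to test; for higher frame rates you would likely switch to multipart uploads or a streaming approach.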
Hello, you said that you guys are using Flask for the server, but I see that you are using an Azure cloud VM. Is there any difference between the two servers?
Hello, may I ask one last question: how do I deploy it on an Azure cloud VM? Can you give step-by-step instructions?
Why didn't the camera translate the sign language I was doing?